US7360253B2 - System and method to lock TPM always ‘on’ using a monitor - Google Patents

System and method to lock TPM always ‘on’ using a monitor

Info

Publication number
US7360253B2
Authority
US
United States
Prior art keywords
computer
monitor
watchdog circuit
tpm
signal
Prior art date
2004-12-23
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US11/021,021
Other versions
US20060143446A1 (en)
Inventor
Alexander Frank
Paul England
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to US11/021,021 (US7360253B2)
Assigned to MICROSOFT CORPORATION. Assignors: ENGLAND, PAUL; FRANK, ALEXANDER
Priority to PCT/US2005/046091 (WO2006071630A2)
Priority to EP05854752A (EP1829274A4)
Priority to BRPI0519080-0A (BRPI0519080A2)
Priority to JP2007548385A (JP4945454B2)
Priority to KR1020077012294A (KR101213807B1)
Priority to RU2007123617/09A (RU2007123617A)
Priority to CN2005800407642A (CN101116070B)
Priority to MX2007006143A (MX2007006143A)
Publication of US20060143446A1
Publication of US7360253B2
Application granted
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. Assignors: MICROSOFT CORPORATION
Adjusted expiration
Status: Expired - Fee Related (current)

Abstract

A computer may be secured from attack by including a trusted environment used to verify a known monitor. The monitor may be used to determine a state of the computer for compliance to a set of conditions. The conditions may relate to terms of use, such as credits available for pay-per-use, or that the computer is running certain software, such as virus protection, or that unauthorized peripherals are not attached, or that a required token is present. The monitor may send a signal directly or through the trusted environment to a watchdog circuit. The watchdog circuit disrupts the use of the computer when the signal is not received in a given timeout period.

Description

BACKGROUND
A trusted platform module (TPM) for use in computing devices such as personal computers is known. The purpose of a TPM is to provide computer identity and secure services related to transactions, licensing of applications and media, protecting user data, and special functions.
Trusted platform modules are commercially available; for example, the ST19WP18 module is available from STMicroelectronics. The TPM stores keys and subsequently uses those keys to authenticate application programs, Basic Input/Output System (BIOS) information, or identities. However, use of the TPM is voluntary and, according to current and anticipated standards and implementations, cannot be used to mandate a condition on the computing device. Some business models assume the computer is out of the direct control of the computer owner/supplier, for example, a pay-per-use business model. In such an instance, circumvention of TPM services may be possible, and if circumvention occurs, it may have a negative impact on the business.
SUMMARY
A trusted platform module (TPM) may be used to authenticate a monitor program that enforces conditions on a computing device. Owner keys injected or written to the TPM may be used to require that a monitor approved by the owner is operational. In turn, the approved monitor has access to resources of the TPM by way of the monitor's authenticated status. Such a secure resource of the TPM may be, for example, a general purpose input/output (GPIO) port. A simple watchdog timer may be configured to reset the computer on a timed interval unless the watchdog timer is restarted within the interval period by a signal received using the GPIO.
By configuring the computer in this manner, the TPM may be used to help ensure a known monitor is running, and the watchdog timer may be used to help ensure that neither the monitor nor the TPM is disabled or tampered with.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of a network interconnecting a plurality of computing resources;
FIG. 2 is a simplified and representative block diagram of a computer in accordance with an embodiment of the current disclosure;
FIG. 3 is a simplified and representative block diagram showing a hierarchical representation of functional layers within the computer of FIG. 2;
FIG. 4 is a simplified and representative block diagram of a computer architecture of the computer of FIG. 2;
FIG. 5 is a simplified and representative block diagram of an alternate computer architecture of the computer of FIG. 2;
FIG. 6 is a simplified and representative block diagram of the TPM; and
FIG. 7 is a flow chart depicting a method of locking-on a TPM using a monitor.
DETAILED DESCRIPTION
Although the following text sets forth a detailed description of numerous different embodiments, it should be understood that the legal scope of the description is defined by the words of the claims set forth at the end of this disclosure. The detailed description is to be construed as exemplary only and does not describe every possible embodiment since describing every possible embodiment would be impractical, if not impossible. Numerous alternative embodiments could be implemented, using either current technology or technology developed after the filing date of this patent, which would still fall within the scope of the claims.
It should also be understood that, unless a term is expressly defined in this patent using the sentence “As used herein, the term ‘—————’ is hereby defined to mean . . . ” or a similar sentence, there is no intent to limit the meaning of that term, either expressly or by implication, beyond its plain or ordinary meaning, and such term should not be interpreted to be limited in scope based on any statement made in any section of this patent (other than the language of the claims). To the extent that any term recited in the claims at the end of this patent is referred to in this patent in a manner consistent with a single meaning, that is done for the sake of clarity only so as to not confuse the reader, and it is not intended that such claim term be limited, by implication or otherwise, to that single meaning. Finally, unless a claim element is defined by reciting the word “means” and a function without the recital of any structure, it is not intended that the scope of any claim element be interpreted based on the application of 35 U.S.C. § 112, sixth paragraph.
Much of the inventive functionality and many of the inventive principles are best implemented with or in software programs or instructions and integrated circuits (ICs) such as application specific ICs. It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation. Therefore, in the interest of brevity and minimization of any risk of obscuring the principles and concepts in accordance with the present invention, further discussion of such software and ICs, if any, will be limited to the essentials with respect to the principles and concepts of the preferred embodiments.
FIG. 1 illustrates a network 10 that may be used to implement a dynamic software provisioning system. The network 10 may be the Internet, a virtual private network (VPN), or any other network that allows one or more computers, communication devices, databases, etc., to be communicatively connected to each other. The network 10 may be connected to a personal computer 12 and a computer terminal 14 via an Ethernet connection 16, a router 18, and a landline 20. On the other hand, the network 10 may be wirelessly connected to a laptop computer 22 and a personal digital assistant 24 via a wireless communication station 26 and a wireless link 28. Similarly, a server 30 may be connected to the network 10 using a communication link 32 and a mainframe 34 may be connected to the network 10 using another communication link 36.
FIG. 2 illustrates a computing device in the form of a computer 110. Components of the computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, also known as Mezzanine bus.
Computer 110 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 110 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 110. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 2 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 2 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disk drive 155 that reads from or writes to a removable, nonvolatile optical disk 156 such as a CD ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 is typically connected to the system bus 121 through a non-removable memory interface such as interface 140, and magnetic disk drive 151 and optical disk drive 155 are typically connected to the system bus 121 by a removable memory interface, such as interface 150.
The drives and their associated computer storage media discussed above and illustrated in FIG. 2 provide storage of computer readable instructions, data structures, program modules and other data for the computer 110. In FIG. 2, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB). A cathode ray tube 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 190.
The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 typically includes a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181.
The communications connections 170, 172 allow the device to communicate with other devices. The communications connections 170, 172 are an example of communication media. The communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. A “modulated data signal” may be a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Computer readable media may include both storage media and communication media.
The trusted platform module 125 or other trusted environment, discussed in more detail below, may store data and keys and may verify executable code and data. The trusted platform module specification states in section 4.5.2.1, “As part of system initialization, measurements of platform components and configurations will be taken. Taking measurements will not detect unsafe configurations nor will it take action to prevent continuation of the initialization process. This responsibility rests with a suitable reference monitor such as an operating system.” Because the TPM is not defined as an enforcement tool, the enhancements described below supplement the common TPM.
A watchdog circuit 126 may be configured to measure a period of time and, when the time expires, trigger a signal 127 that disrupts the operation of the computer 110. The disruption may be a system reset that causes the computer 110 to reboot. The disruption may interrupt data on the system bus 121 or a peripheral bus. To prevent the watchdog 126 from disrupting the operation of the computer 110, a signal over communication connection 128 may be required to reset the period of time and start the timing process again. As shown in FIG. 2, the watchdog timer reset signal may be carried over communication connection 128. As discussed more below, the TPM 125 may initiate the watchdog timer reset responsive to a signal from a monitor program. The steps described in the following may be used to help ensure that a specific, desired monitor is present and operating by using the combination of the TPM 125 and watchdog circuit 126.
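To make the timing relationship concrete, the following minimal C sketch models the watchdog behavior just described. The names (watchdog_restart, watchdog_tick, disrupt_computer) and the timeout value are illustrative assumptions rather than anything specified in the patent; a real watchdog circuit 126 would implement this in hardware, driven by its own clock rather than by the main processor.

    #include <stdio.h>

    #define WATCHDOG_TIMEOUT_TICKS 600   /* illustrative: 1 Hz ticks, roughly 10 minutes */

    static unsigned int remaining_ticks = WATCHDOG_TIMEOUT_TICKS;

    /* Called when a valid restart signal arrives over communication connection 128. */
    void watchdog_restart(void)
    {
        remaining_ticks = WATCHDOG_TIMEOUT_TICKS;
    }

    /* Hypothetical sanction; in hardware this would assert the reset line (signal 127). */
    static void disrupt_computer(void)
    {
        puts("watchdog expired: disrupting the computer (e.g. system reset)");
    }

    /* Called once per tick by the watchdog's own clock, independent of the CPU. */
    void watchdog_tick(void)
    {
        if (remaining_ticks > 0)
            remaining_ticks--;
        if (remaining_ticks == 0) {
            disrupt_computer();
            remaining_ticks = WATCHDOG_TIMEOUT_TICKS;   /* begin counting again after the sanction */
        }
    }

    int main(void)
    {
        for (int i = 0; i <= WATCHDOG_TIMEOUT_TICKS; i++)
            watchdog_tick();             /* no restart arrives, so the sanction eventually fires */
        return 0;
    }

The only way to keep disrupt_computer from ever running is to call watchdog_restart within each timeout period, which is exactly the privilege the remainder of the disclosure reserves for the verified monitor.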
FIG. 3, a simplified block diagram showing a hierarchical representation of functional layers within a representative computer such as that of FIG. 2, is discussed and described. A trusted platform module 202 may be hardware that resides below the basic input/output system (BIOS) 204. The TPM 202 may act as a resource to the computer and higher level operations, such as the BIOS 204. The BIOS may activate a monitor 206. The monitor 206 resides below the operating system 208 at the monitor level 210. The monitor 206 may access and use resources of the TPM 202 to carry out policies associated with the operation of higher level entities. The operating system 208 supports the major functions of the computer 110 and may be responsible (after initial bootstrap processes hand over control) for communication, user input/output, disk and other memory access, application launch, etc. The operating system may also directly access and use the TPM 202. As shown, first and second applications 212, 214 may run on the operating system 208. In some cases, the monitor may enforce policies related to both the operating system 208 and the applications 212, 214. For example, before application 214 may be launched from disk 216, the operating system may check licensing status, depicted by line 218, to determine if the application 214 meets given criteria for launching. The criteria for launch and subsequent metering of applications using a monitor function are discussed in more detail in US patent application “Method for Pay-As-You-Go Computer and Dynamic Differential Pricing” filed on Dec. 8, 2004 as Ser. No. 11/006,837. Briefly, the monitor 206 may be used to measure and meter application programs, utilities and computer resources, for example, in a pay-per-use or pre-paid scenario.
Referring briefly to FIG. 6, the TPM 202 is discussed in more detail. The TPM 202 may have an internal memory 502 comprising both volatile and non-volatile memory, at least part of which may be secure from tampering or unauthorized write operations. The memory may store an owner key 504 for use in validating entities that claim affiliation with the owner for the purpose of configuring the TPM 202 and for establishing trust with an outside entity. The memory may also include, among other things, a platform configuration register (PCR) 506. The PCR 506 may be used to store a hash or other strong identifier associated with the monitor 206. The TPM 202 may also include a clock 508 and cryptographic services 510. Both may be used in the authentication and authorization processes as will be discussed below in more detail. The TPM 202 may also include a bus 512, sometimes referred to as a Single-pin Bus or general purpose input/output (GPIO). In one embodiment, the GPIO 512 may be coupled to the watchdog circuit, as described elsewhere.
The TPM 202 may also be coupled to a general purpose bus 514 for data communication within the computer, for example, with a process running the monitor 206. Using the bus 514, or in some cases another mechanism 516, the TPM 202 may be able to measure the monitor. The measurement of the monitor may include checking a cryptographic hash of the monitor, that is, checking a hash of the memory range occupied by the monitor. The PCR may be used to store the measurement data 506. The owner key 504 may be affiliated with the hash 506 of the monitor, for example, by a digitally signed hash of the monitor that requires the owner key 504 for confirmation. The owner key 504 may be written or injected into the TPM 202 at the time of manufacture, or later, for example, at the time of delivery to a customer. The owner key 504, then, is used to authenticate the monitor 206.
In an exemplary embodiment, the monitor 206 is measured by a trusted module preceding it in the boot sequence, for example, by the BIOS 204. The monitor measurement, such as a hash computed by the BIOS 204, may be stored in the TPM PCR 506 via the bus 514. When the TPM 202 validates the measurement (hash), the TPM 202 may then allow access to the unique keys and/or other secrets allocated to the monitor 206 and stored in the TPM 202. The TPM 202 will allocate to any monitor the keys and secrets corresponding to whatever measurement that monitor's measurement matches.
The TPM may be programmed with an owner key 504 and a corresponding monitor metric 506, i.e., a hash of a known monitor 206. The owner key is used to program or update the monitor metric 506, such that only the entity in possession of the owner key 504 may set the PCR register 506 for the known monitor 206. The standard TPM 202 has a characteristic that only a monitor 206 verified against a given measurement 506 may have control of the GPIO 512. When the GPIO 512 is connected in a tamper-resistant manner to the watchdog circuit 126, a chain of trust may be completed. That is, only a verified monitor 206 may control the GPIO 512 and only the GPIO 512 may be used to restart the watchdog circuit 126. Therefore, while the monitor 206 may be replaced or altered, only the monitor 206 verified by the PCR 506 set by the owner key 504 may be used to restart the timer of the watchdog circuit 126. Thus only the authorized monitor may be used to prevent the watchdog from disrupting the computer 110 by, for example, resetting the computer 110. The timer of the watchdog circuit 126 may be set to a period selected to allow restoration of a corrupted or tampered computer 110, but short enough to prevent significant useful work from being done on the computer 110. For example, the watchdog may be set to disrupt the computer 110 every 10-20 minutes unless restarted by the validated monitor 206.
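A minimal C sketch of this chain of trust follows. It substitutes a toy 64-bit FNV-1a hash for the SHA-style digest a real TPM would compute, and the helper names (measure_monitor, gpio_restart_watchdog, tpm_request_watchdog_restart, owner_set_monitor_metric) are hypothetical; the sketch only illustrates the gating idea that the GPIO 512 is exercised solely when the running monitor's measurement matches the reference value set with the owner key 504.

    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Toy stand-in for the cryptographic hash a real TPM would compute. */
    static uint64_t fnv1a64(const uint8_t *data, size_t len)
    {
        uint64_t h = 0xcbf29ce484222325ULL;
        for (size_t i = 0; i < len; i++) {
            h ^= data[i];
            h *= 0x100000001b3ULL;
        }
        return h;
    }

    /* Reference measurement (the monitor metric 506) set by the holder of the owner key 504. */
    static uint64_t owner_set_monitor_metric;

    /* Measure the memory range occupied by the monitor. */
    static uint64_t measure_monitor(const uint8_t *image, size_t len)
    {
        return fnv1a64(image, len);
    }

    /* Hypothetical GPIO 512 action: reached only for a verified monitor. */
    static void gpio_restart_watchdog(void)
    {
        puts("GPIO asserted: watchdog timer restarted");
    }

    /* A monitor's request to restart the watchdog is honored only if its
       measurement matches the owner-set metric; otherwise nothing happens
       and the watchdog circuit 126 will eventually disrupt the computer. */
    void tpm_request_watchdog_restart(const uint8_t *monitor_image, size_t len)
    {
        if (measure_monitor(monitor_image, len) == owner_set_monitor_metric)
            gpio_restart_watchdog();
    }

    int main(void)
    {
        const uint8_t genuine[] = "genuine monitor image";
        const uint8_t altered[] = "altered monitor image";
        owner_set_monitor_metric = measure_monitor(genuine, sizeof genuine); /* provisioning */
        tpm_request_watchdog_restart(genuine, sizeof genuine);               /* restarts the watchdog */
        tpm_request_watchdog_restart(altered, sizeof altered);               /* silently ignored */
        return 0;
    }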
The owner secret 504 and the monitor measurement 506 may be programmed in a secure manufacturing environment, or may be field programmed using transport keys known to the entity programming the owner key 504. Once the owner key 504 is known, the programming entity, for example, a service provider, may set the measurement of the monitor that will determine what monitor is given access to the GPIO bus. The owner key 504 may be required to re-program the owner key. The use of derived keys may facilitate key distribution, scaling and protection from widespread loss should a local owner key 504 be compromised. Key management techniques are known in the data security arts.
FIG. 4 is a block diagram of a representative architecture of a computer 300, the same as or similar to computer 110. The computer may have first and second interface bridges 302, 304. The interface bridges 302, 304 may be connected by a high speed bus 306. The first interface bridge 302 may be connected to a processor 308, graphics controller 310 and memory 312. The memory 312 may host a monitor program 314, as well as other general purpose memory uses.
The second interface bridge 304 may be connected to peripheral buses and components, for example, universal serial bus (USB) 316, Integrated Drive Electronics (IDE) 318, or Peripheral Component Interconnect (PCI) 320, used to connect disk drives, printers, scanners, etc. The second interface bridge may also be connected to a TPM 322. As discussed above, the TPM 322 may have secure memory 324 for key and hash data, and a general purpose input/output (GPIO) 326. The TPM 322 may be physically or logically coupled to the monitor by connection 328. As discussed, the BIOS 204 may measure the monitor 206 and store the measurement in the TPM 322, which allocates to the monitor 314 keys and secrets corresponding to the provided measurement. The monitor 314 is therefore given access to the resources and data locked with these keys and secrets. The connection 328 may also be used by the monitor to control the GPIO 326 for the purpose of sending a signal to the watchdog circuit 330. The signal may cause the watchdog to reset. When the signal is not received by the watchdog circuit 330 in a time period prescribed by a setting in the watchdog circuit 330, a reset or other disruptive signal may be sent over connection 332. To discourage tampering, the connection between the GPIO 326 and the watchdog circuit 330 may be protected, for example, by potting or routing between circuit board layers to prevent manual restarting of the watchdog circuit 330. The computer reset signal connection 332 may be similarly protected from tampering, or at least a portion of the reset signal connection 332 between the watchdog circuit 330 and the main processor computer reset point (not depicted).
FIG. 5 is a representative block diagram of an alternate architecture of the computer of FIG. 2. Compared to the description of FIG. 4, like numbered components are the same. The watchdog circuit 330 has been moved into the second interface bridge 304, showing a representative illustration of how the watchdog circuit 330 may be combined into another circuit to improve tamper resistance. The integration of the watchdog circuit 330 into the second interface bridge chip 304, while itself appropriate, is only illustrative. Since the second interface bridge 304 is a major component of the computer architecture, the desired level of disruption may be carried out from within the second interface bridge 304. Therefore, a connection from a watchdog circuit external to the second interface bridge 304, such as connection 332, may not be required.
In this alternate architecture, the GPIO 326 may not be used to signal the reset to the watchdog circuit 330. Instead, a message may be sent over logical connection 334 directly from the monitor 314 to the watchdog circuit 330.
Because a sufficient level of trust may not exist between the two entities (314, 330), the message may be signed using keys held in the TPM 322. For example, these keys may be associated with the monitor 314 during first boot (e.g. on the manufacturing line, for the sake of trustworthiness). Keys may be assigned arbitrarily, or, as mentioned above, keys may be hierarchically derived from a master key and known data such as a root certificate, serial number or manufacturing sequence number, etc. The watchdog timer 330 may be configured to respect only messages signed using these keys, for example, during the first boot of the computer 110 on the assembly line. In addition, the monitor locks these keys into the TPM 322, such that only a monitor measured identically has access to these keys. A variant of this architecture is for the monitor to rely on the TPM 322 to allocate it these keys uniquely and respectively to its measurement.
During normal operation the monitor 314 may request that the TPM 322 sign on its behalf the message to be sent to the watchdog timer 330. The TPM 322 signs the message with the keys that correspond to the monitor 314 (per its measurement, which is stored into the TPM 322 by the BIOS during each boot). The monitor 314 may receive the signed message from the TPM 322 over a logical connection, for example, connection 328, and then provide it to the watchdog circuit 330 over logical connection 334.
When the watchdog circuit 330 receives the message, the watchdog circuit 330 may use the keys (set during manufacturing) to authenticate the message. Alternately, it may request verification using a key or secret in the TPM 322 using logical connection 336. If another monitor is running, it will measure differently, resulting in different keys and secrets being allocated by the TPM. Therefore, the alternate monitor will not be able to sign the message properly such that it will be authenticated by the watchdog circuit 330. Consequently, the watchdog circuit 330 will initiate a sanction, such as firing a reset of the computer 110 after the expiration of its timing interval. The use of signed or encrypted messages may reduce the opportunity for attacks on the logical connections 328 and 334.
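The signed-message variant can be sketched in C as follows, under stated assumptions: a toy keyed checksum stands in for the MAC or signature the TPM cryptographic services 510 would actually compute, and first-boot provisioning is reduced to copying a shared secret into the watchdog. The function and variable names are illustrative, not taken from the patent.

    #include <stdint.h>
    #include <stdbool.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Toy keyed checksum standing in for an HMAC or digital signature. */
    static uint64_t toy_mac(uint64_t key, const uint8_t *msg, size_t len)
    {
        uint64_t h = key ^ 0xcbf29ce484222325ULL;
        for (size_t i = 0; i < len; i++) { h ^= msg[i]; h *= 0x100000001b3ULL; }
        return h;
    }

    /* Key allocated to the genuine monitor's measurement and locked in the TPM 322;
       a copy is provisioned into the watchdog during first boot on the assembly line. */
    static const uint64_t tpm_key_for_genuine_monitor = 0x1234567890abcdefULL;
    static uint64_t watchdog_provisioned_key;

    /* TPM side: sign the restart message with the key allocated to the calling monitor. */
    static uint64_t tpm_sign_restart(uint64_t callers_key, const uint8_t *msg, size_t len)
    {
        return toy_mac(callers_key, msg, len);
    }

    /* Watchdog side (logical connection 334): restart only if the message authenticates. */
    static bool watchdog_receive(const uint8_t *msg, size_t len, uint64_t mac)
    {
        if (toy_mac(watchdog_provisioned_key, msg, len) == mac) {
            puts("watchdog: restart accepted");
            return true;
        }
        puts("watchdog: bad signature, restart ignored");
        return false;
    }

    int main(void)
    {
        watchdog_provisioned_key = tpm_key_for_genuine_monitor;   /* first-boot provisioning */
        const uint8_t msg[] = "restart";
        /* Genuine monitor: the TPM signs with the key matching its measurement. */
        watchdog_receive(msg, sizeof msg,
                         tpm_sign_restart(tpm_key_for_genuine_monitor, msg, sizeof msg));
        /* Altered monitor: a different key is allocated, so the restart is rejected. */
        watchdog_receive(msg, sizeof msg,
                         tpm_sign_restart(0xdeadbeefULL, msg, sizeof msg));
        return 0;
    }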
FIG. 7, a flowchart illustrating a method to lock a trusted platform module (TPM) always “on” using a monitor, is discussed and described. A typical TPM, for example, TPM 125, may be optionally enabled by the user. As described below, the method helps ensure both that the TPM 125 remains enabled and that a monitor 206 selected by the owner of the business will be executed, at the risk of sanctions such as disabling the computer 110.
Starting with application of power at the start 402, the computer 110 may initiate the various hardware components through normal boot mechanisms. This applies to the TPM 322 as well. The boot sequence may follow a Trusted Computing Platform Alliance (TCPA) methodology. The Core Root of Trust for Measurement (CRTM) (not depicted) measures the BIOS 133 and stores 403 its measurement into the TPM 322. Then the CRTM loads and executes the BIOS 133. (The CRTM may ideally be stored in a trustworthy location in the computer 110 which is very difficult to attack.)
The BIOS 133 may execute in a conventional fashion, initiating and enumerating various computer components, with one exception: it may measure each software module before loading and executing it. Also, it may store these measurements into the TPM 322. Particularly, it may measure the monitor 314 and store 405 the monitor measurement into the TPM 322.
The TPM 322 allocates 408 keys and secrets uniquely and respectively to the monitor measurement. The essence is that the TPM 322 consistently allocates 408 unique keys and secrets that correspond to a given measurement. Consequently, the secrets available to a monitor 314 are unique, consistent and respective. As a result, any monitor may lock resources such that they will be exclusively available only to that particular monitor. For example, this enables the linking of the genuine monitor 314 to the watchdog circuit 330 by programming the GPIO 326 connected to the watchdog circuit 330 to respect only the measurement associated with the genuine monitor 314. The GPIO 326 is then available only to a monitor that measures identically to the genuine monitor 314.
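One simple way to obtain keys that are "unique, consistent and respective" is to derive them from a secret held only inside the TPM and the stored measurement, so that the same measurement always yields the same keys while any other measurement yields different ones. The short C sketch below uses a toy mixing function in place of a proper key-derivation function, and the names (tpm_internal_secret, derive_monitor_key) are assumptions made for illustration only.

    #include <stdint.h>
    #include <stdio.h>

    /* Toy mixing function standing in for a real key-derivation function. */
    static uint64_t mix(uint64_t a, uint64_t b)
    {
        uint64_t h = a ^ 0xcbf29ce484222325ULL;
        for (int i = 0; i < 8; i++) {
            h ^= (b >> (8 * i)) & 0xff;
            h *= 0x100000001b3ULL;
        }
        return h;
    }

    /* Secret that never leaves the TPM (the value here is only a placeholder). */
    static const uint64_t tpm_internal_secret = 0x0f1e2d3c4b5a6978ULL;

    /* Keys allocated to a monitor are a deterministic function of its measurement:
       same measurement -> same keys; altered monitor -> different keys. */
    uint64_t derive_monitor_key(uint64_t monitor_measurement)
    {
        return mix(tpm_internal_secret, monitor_measurement);
    }

    int main(void)
    {
        uint64_t genuine = 0x1111111111111111ULL;   /* measurement of the genuine monitor */
        uint64_t altered = 0x2222222222222222ULL;   /* measurement of a tampered monitor  */
        printf("genuine monitor key: %016llx\n",
               (unsigned long long)derive_monitor_key(genuine));
        printf("altered monitor key: %016llx\n",
               (unsigned long long)derive_monitor_key(altered));
        return 0;
    }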
Regardless of whether the loaded monitor is genuine or not, the boot sequence loads and executes 410 the monitor. The normal boot process may continue 411 and, assuming a successful boot, normal operation 412 of the computer 110 follows.
As soon as the monitor 314 is loaded and executed at 410, it starts its loop (413-419). First, the monitor 314 sends 413 a message to the watchdog circuit 330 via the TPM GPIO 326. The message may signal the TPM 322 to use the GPIO 326 to signal the watchdog circuit 330 to restart its timer (not depicted).
After sending the message to the TPM 322, the monitor returns to the testing state 414. The monitor may test 414 that the state of the computer 110 complies with a current policy. The current policy may involve the specific presence or absence of known programs, utilities or peripherals. The test may also be related to metering or other pay-per-use metrics. For example, the test may check for available provisioning packets for consumption vs. specific application program operation. In another embodiment, the test may be related to operation during a specific time period, such as a calendar month.
When the test 414 fails, the No branch may be followed 416, where the monitor acts in accordance with the policy. The action may be just a warning code sent to the operating system or a warning message presented to the user. The action may be some sanction imposed on the operating system and user, e.g. limiting or eliminating a certain function of the computer. This may apply to hardware and/or software functions. For instance, the computer may be slowed down, certain software may be disabled, or certain devices may be disabled, e.g. a webcam. More severe sanctions may be to limit the amount of RAM available to the OS, or to reduce the Instruction-Set-Architecture available to the operating system. In an exemplary embodiment, one course of action available to the monitor 314 when a non-compliant condition is found may be to not take action to restart the timer of the watchdog circuit 330 and let the watchdog circuit 330 impose a sanction.
When the test succeeds, the Yes branch from 414 may be followed. In either case, execution waits 419 for an interval before returning to step 413. The wait interval avoids exhausting the computer's resources by repeatedly running the monitor 314. Obviously, this wait interval 419 should be some fraction of the watchdog timer counting period. A usable fraction may be determined by the likelihood that normal operation of the computer would delay completion of the loop. Then the loop returns to step 413 discussed above. The period for repeating the loop may be set to any time less than the watchdog circuit timeout period; otherwise an unwarranted disruption may take place.
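A compact C sketch of the monitor loop 413-419 follows. The policy check is a stub and the helper names (send_restart_via_tpm, policy_compliant, apply_policy_sanction) are hypothetical; what matters is the structure: send the heartbeat, test the computer's state against policy, act on a violation, then wait a fraction of the watchdog timeout before repeating.

    #include <stdbool.h>
    #include <stdio.h>
    #include <unistd.h>   /* sleep(); the sketch assumes a POSIX-like environment */

    #define WATCHDOG_TIMEOUT_SECONDS (15 * 60)                       /* e.g. 15 minutes     */
    #define MONITOR_WAIT_SECONDS     (WATCHDOG_TIMEOUT_SECONDS / 4)  /* fraction of timeout */

    /* Step 413: ask the TPM to restart the watchdog via the GPIO 326. */
    static void send_restart_via_tpm(void) { puts("monitor: restart message sent to TPM"); }

    /* Step 414: e.g. required software present, no forbidden peripherals, metering balance ok. */
    static bool policy_compliant(void) { return true; }

    /* Step 416: warn, limit functions, or simply withhold the next heartbeat. */
    static void apply_policy_sanction(void) { puts("monitor: non-compliant state, applying sanction"); }

    int main(void)
    {
        for (;;) {
            send_restart_via_tpm();        /* 413: keep the watchdog from firing     */
            if (!policy_compliant())       /* 414: check the state of the computer   */
                apply_policy_sanction();   /* 416: act in accordance with the policy */
            sleep(MONITOR_WAIT_SECONDS);   /* 419: wait a fraction of the timeout    */
        }
    }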
When the TPM 322 receives 420 the message, the TPM 322 acts according to the monitor measurement. If the measurement is deemed non-genuine, the test 420 fails and the No branch may be taken to box 422, where no action is taken, i.e. the signal to the watchdog circuit 330 is not sent. No further action may be needed by the TPM 322 because the watchdog circuit 330 will disrupt the computer 110 unless steps are taken to stop it. Optionally, the TPM 322 may, at 422, generate an error for logging, generate a warning/error code, notify the operating system, and display a message to the user.
When the TPM 322 verifies that the monitor measurement is genuine, the GPIO 326 may be activated to signal 424 the watchdog circuit 330 to restart its timer. As discussed above, restarting the watchdog circuit timer prevents the watchdog circuit 330 from initiating a disruptive action, such as a reset of the computer 110. The watchdog circuit 330 may then restart 426 the timer at its initial value. The timer will then count 428 and test 430 for expiration of a pre-determined time. The timer period may be settable. Timer implementation is known, and whether the timer counts up to a given number, down to zero, counts to a set clock time, or uses another mechanism is a design choice.
If the timer has not expired, the No branch from 430 may be taken back to 428, which will take another count from the timer. When time has expired, the Yes branch from 430 may be taken and the watchdog may enforce a sanction by disrupting 432 the computer. The disruption may be a system reset causing a re-boot, disabling of peripherals, etc. The period for the watchdog circuit timer to count down to a disruption 432 may be long enough to allow a user to correct a non-compliant condition on the computer 110, but should be short enough to restrict reliable or useful activity on the computer 110.
The link from 432 to 426 may be conceptual. If the disruption is implemented by a reset of the whole computer, this link is moot. In the event of a more subtle disruption, e.g. slowing the computer down, this link is used to restart the countdown and may result in a more disabling disruption, for example, a reset.
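That escalation path can be sketched in C as follows; the particular sanction levels and the escalate-on-repeat behavior are illustrative assumptions about one possible design rather than anything mandated by the disclosure.

    #include <stdio.h>

    /* Illustrative sanction levels, from mild to severe. */
    enum sanction { WARN_ONLY, SLOW_COMPUTER, DISABLE_PERIPHERALS, FULL_RESET };

    static enum sanction level = WARN_ONLY;

    /* Called each time the watchdog timer expires (432). If the previous, milder
       disruption did not lead to a valid restart, escalate (the link from 432 to 426). */
    void watchdog_expired(void)
    {
        switch (level) {
        case WARN_ONLY:           puts("sanction: warn the user");          break;
        case SLOW_COMPUTER:       puts("sanction: slow the computer down"); break;
        case DISABLE_PERIPHERALS: puts("sanction: disable peripherals");    break;
        case FULL_RESET:          puts("sanction: reset the computer");     return;
        }
        level = (enum sanction)(level + 1);   /* escalate for the next expiration */
    }

    /* A valid restart signal de-escalates back to the mildest level. */
    void watchdog_restarted(void)
    {
        level = WARN_ONLY;
    }

    int main(void)
    {
        for (int i = 0; i < 4; i++)
            watchdog_expired();   /* demonstrates escalation up to a full reset */
        return 0;
    }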
It can be seen that two purposes of the owner of a business associated with supplying computers on a pay-per-use basis, or of another underwriter, may be accomplished by the above method. First, if the TPM 322 is disabled because the user opted out of using the TPM 322 or hacked the computer to disable the TPM 322, messages to the watchdog circuit 330 will not be generated and the computer 110 will be disrupted.
Similarly, if the TPM 322 is enabled and operational, but the monitor is altered or replaced, possibly to alter or ignore the policies in effect (e.g. usage policies), the TPM will not honor the monitor's requests. Practically, an altered monitor's measurement is different from the measurement of the genuine monitor. Consequently, when the altered monitor's measurement is stored into the TPM 322, the TPM 322 will allocate a set of keys and secrets respective and unique to the altered monitor, and different from those needed for operation of the GPIO 326. As a result, any message from the altered monitor to the TPM to signal the GPIO 326 will not be honored. Therefore, the watchdog circuit 330 will not receive restart signals and the computer 110 will be disrupted.
In both cases, the TPM 322 must be enabled and the genuine monitor 314 must be in place and operational for correct operation of the computer 110.
Other uses for the above method and apparatus may be envisioned. For example, part of the boot process may require presentation of credentials by an authorized user. If correct credentials are not presented, the boot process may not load the genuine monitor, which will ultimately result in the disabling of the computer 110.

Claims (9)

US11/021,021  2004-12-23  2004-12-23  System and method to lock TPM always ‘on’ using a monitor  Expired - Fee Related  US7360253B2 (en)

Priority Applications (9)

Application Number  Priority Date  Filing Date  Title
US11/021,021 (US7360253B2)  2004-12-23  2004-12-23  System and method to lock TPM always ‘on’ using a monitor
EP05854752A (EP1829274A4)  2004-12-23  2005-12-20  System and method for locking a trusted platform module always "on" by means of a monitoring device
KR1020077012294A (KR101213807B1)  2004-12-23  2005-12-20  System and method to lock TPM always 'on' using a monitor
MX2007006143A (MX2007006143A)  2004-12-23  2005-12-20  System and method to lock TPM always 'on' using a monitor
BRPI0519080-0A (BRPI0519080A2)  2004-12-23  2005-12-20  System and method to block an always-on TPM using a monitor
JP2007548385A (JP4945454B2)  2004-12-23  2005-12-20  Method and system for locking the TPM always "on" using a monitor
PCT/US2005/046091 (WO2006071630A2)  2004-12-23  2005-12-20  System and method to lock TPM always 'on' using a monitor
RU2007123617/09A (RU2007123617A)  2004-12-23  2005-12-20  System and method of locking the TPM module "always on" using the monitor
CN2005800407642A (CN101116070B)  2004-12-23  2005-12-20  System and method to lock TPM always 'on' using a monitor

Applications Claiming Priority (1)

Application Number  Priority Date  Filing Date  Title
US11/021,021 (US7360253B2)  2004-12-23  2004-12-23  System and method to lock TPM always ‘on’ using a monitor

Publications (2)

Publication Number  Publication Date
US20060143446A1 (en)  2006-06-29
US7360253B2 (en)  2008-04-15

Family

ID=36613166

Family Applications (1)

Application Number  Title  Priority Date  Filing Date
US11/021,021 (US7360253B2, Expired - Fee Related)  System and method to lock TPM always ‘on’ using a monitor  2004-12-23  2004-12-23

Country Status (9)

Country  Link
US (1)  US7360253B2 (en)
EP (1)  EP1829274A4 (en)
JP (1)  JP4945454B2 (en)
KR (1)  KR101213807B1 (en)
CN (1)  CN101116070B (en)
BR (1)  BRPI0519080A2 (en)
MX (1)  MX2007006143A (en)
RU (1)  RU2007123617A (en)
WO (1)  WO2006071630A2 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US20070226518A1 (en)*2006-03-222007-09-27Fujitsu LimitedInformation processing device having activation verification function
US20080104701A1 (en)*2006-05-222008-05-01Eric PeacockSystem and method for secure operating system boot
US20080320312A1 (en)*2007-06-212008-12-25Microsoft CorporationHardware-Based Computer Theft Deterrence
US20100212021A1 (en)*2009-02-182010-08-19Harris Technology, LlcDecrement software
US20100235888A1 (en)*2009-03-162010-09-16Konica Minolta Business Technologies, Inc.Image forming apparatus, function extending method and user authentication system
CN102063593A (en)*2011-01-072011-05-18北京工业大学Credible device with active control function and authentication method thereof
EP2393007A1 (en)2010-06-032011-12-07Telefonaktiebolaget L M Ericsson AB (Publ)Processing device
US8176564B2 (en)2004-11-152012-05-08Microsoft CorporationSpecial PC mode entered upon detection of undesired state
US8336085B2 (en)2004-11-152012-12-18Microsoft CorporationTuning product policy using observed evidence of customer behavior
US8347078B2 (en)2004-10-182013-01-01Microsoft CorporationDevice certificate individualization
US8353046B2 (en)2005-06-082013-01-08Microsoft CorporationSystem and method for delivery of a modular operating system
US8375221B1 (en)2011-07-292013-02-12Microsoft CorporationFirmware-based trusted platform module for arm processor architectures and trustzone security extensions
US8438645B2 (en)2005-04-272013-05-07Microsoft CorporationSecure clock with grace periods
US8464348B2 (en)2004-11-152013-06-11Microsoft CorporationIsolated computing environment anchored into CPU and motherboard
US8700535B2 (en)2003-02-252014-04-15Microsoft CorporationIssuing a publisher use license off-line in a digital rights management (DRM) system
US8725646B2 (en)2005-04-152014-05-13Microsoft CorporationOutput protection levels
US8781969B2 (en)2005-05-202014-07-15Microsoft CorporationExtensible media rights
US20150220927A1 (en)*2013-09-252015-08-06Ned M. SmithMethod, apparatus and system for providing transaction indemnification
US9189605B2 (en)2005-04-222015-11-17Microsoft Technology Licensing, LlcProtected computing environment
US20150365231A1 (en)*2014-06-122015-12-17Nxp B.V.Method for configuring a secure element, key derivation program, computer program product and configurable secure element
US9363481B2 (en)2005-04-222016-06-07Microsoft Technology Licensing, LlcProtected media pipeline
US9436804B2 (en)2005-04-222016-09-06Microsoft Technology Licensing, LlcEstablishing a unique session key using a hardware functionality scan
US9612893B2 (en)2015-05-112017-04-04Silicon Laboratories Inc.Peripheral watchdog timer
US10460106B2 (en)2015-02-062019-10-29Alibaba Group Holding LimitedMethod and device for identifying computer virus variants
US11977635B2 (en)2020-05-272024-05-07Basler AktiengesellschaftProtection of computer systems against manipulation and functional anomalies

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US7908483B2 (en)*2005-06-302011-03-15Intel CorporationMethod and apparatus for binding TPM keys to execution entities
US20070168574A1 (en)*2005-09-282007-07-19Dell Products L.P.System and method for securing access to general purpose input/output ports in a computer system
JP2007242207A (en)*2006-03-132007-09-20Fujitsu Ltd Medium scanning method for disk device
US7984283B2 (en)*2006-05-222011-07-19Hewlett-Packard Development Company, L.P.System and method for secure operating system boot
JP4048382B1 (en)*2006-09-012008-02-20富士ゼロックス株式会社 Information processing system and program
US20080077420A1 (en)*2006-09-272008-03-27Daryl CromerSystem and Method for Securely Updating Remaining Time or Subscription Data for a Rental Computer
US7971056B2 (en)*2006-12-182011-06-28Microsoft CorporationDirect memory access for compliance checking
US20080147555A1 (en)*2006-12-182008-06-19Daryl Carvis CromerSystem and Method for Using a Hypervisor to Control Access to a Rental Computer
US7631169B2 (en)*2007-02-022009-12-08International Business Machines CorporationFault recovery on a massively parallel computer system to handle node failures without ending an executing job
US9805196B2 (en)2009-02-272017-10-31Microsoft Technology Licensing, LlcTrusted entity based anti-cheating mechanism
CN101984575B (en)*2010-10-142015-06-03中兴通讯股份有限公司Method and device for protecting mobile terminal software
US9256734B2 (en)*2012-04-272016-02-09Broadcom CorporationSecurity controlled multi-processor system
WO2013166278A1 (en)*2012-05-022013-11-07Visa International Service AssociationSmall form-factor cryptographic expansion device
US9633210B2 (en)*2013-09-132017-04-25Microsoft Technology Licensing, LlcKeying infrastructure
US9542568B2 (en)*2013-09-252017-01-10Max Planck Gesellschaft Zur Foerderung Der Wissenschaften E.V.Systems and methods for enforcing third party oversight of data anonymization
US10097513B2 (en)2014-09-142018-10-09Microsoft Technology Licensing, LlcTrusted execution environment extensible computing device interface
US20170116432A1 (en)*2015-01-222017-04-27Daniel MinoliSystem and methods for cyber-and-physically-secure high grade weaponry
EP3270321B1 (en)*2016-07-142020-02-19Kontron Modular Computers SASTechnique for securely performing an operation in an iot environment
US10402566B2 (en)*2016-08-012019-09-03The Aerospace CorporationHigh assurance configuration security processor (HACSP) for computing devices
CN111279343B (en)*2017-08-162024-07-02惠普发展公司,有限责任合伙企业 Storage device monitoring
US10659054B2 (en)*2018-02-232020-05-19Nxp B.V.Trusted monotonic counter using internal and external non-volatile memory
JP7322233B2 (en)*2018-06-262023-08-07キヤノン株式会社 Information processing device and tampering detection method for detecting tampering of software executed at startup
JP7059127B2 (en)*2018-06-262022-04-25キヤノン株式会社 Information processing device that detects tampering with software executed at startup and its control method
US10965551B2 (en)*2018-11-212021-03-30Microsoft Technology Licensing, LlcSecure count in cloud computing networks
US11232217B2 (en)*2018-12-062022-01-25Oracle International CorporationManaging a security policy for a device
US11316694B2 (en)*2019-03-272022-04-26Microsoft Technology Licensing, LlcCryptographic hardware watchdog
JP7522547B2 (en)2019-09-202024-07-25キヤノン株式会社 Information processing device and reset control method

Citations (59)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
US4817094A (en)*1986-12-311989-03-28International Business Machines CorporationFault tolerant switch with selectable operating modes
US4855922A (en)*1987-03-201989-08-08Scientific-Atlanta, Inc.Apparatus and method for monitoring an energy management system
US5522040A (en)*1990-12-101996-05-28Robert Bosch GmbhArrangement for testing a watchdog circuit
US5563799A (en)*1994-11-101996-10-08United Technologies Automotive, Inc.Low cost/low current watchdog circuit for microprocessor
US20020023212A1 (en)*2000-08-182002-02-21Hewlett-Packard CompanyPerformance of a service on a computing platform
US6385727B1 (en)*1998-09-252002-05-07Hughes Electronics CorporationApparatus for providing a secure processing environment
US6408170B1 (en)*1997-10-082002-06-18U.S. Philips CorporationControl circuit for a microcontroller
US20020124212A1 (en)*1997-03-242002-09-05Werner NitschkeWatchdog circuit
US20020184482A1 (en)*2001-05-312002-12-05John LacombeApplication-level software watchdog timer
US20030084337A1 (en)*2001-10-032003-05-01Simionescu Dan C.Remotely controlled failsafe boot mechanism and manager for a network device
US20030084285A1 (en)*2001-10-262003-05-01International Business Machines CorporationMethod and system for detecting a tamper event in a trusted computing environment
US20030126519A1 (en)*2001-12-282003-07-03Kresimir OdorcicMethod and apparatus for controlling an electronic control
US20030188165A1 (en)*2002-03-292003-10-02Sutton James A.System and method for execution of a secured environment initialization instruction
US20040093508A1 (en)*2002-08-032004-05-13Dirk FoerstnerMethod for monitoring a microprocessor and circuit arrangement having a microprocessor
US20040199769A1 (en)*2003-04-072004-10-07Proudler Graeme JohnProvision of commands to computing apparatus
US20050028000A1 (en)*2003-07-282005-02-03Mallik BulusuMethod and apparatus for trusted blade device computing
US20050039013A1 (en)*2003-08-112005-02-17Bajikar Sundeep M.Method and system for authenticating a user of a computer system that has a trusted platform module (TPM)
US6871283B1 (en)*1990-02-132005-03-22Hewlett-Packard Development Company, L.P.Processing trusted commands in trusted and untrusted environments
US20050108564A1 (en)*2003-11-132005-05-19International Business Machines CorporationReducing the boot time of a TCPA based computing system when the Core Root of Trust Measurement is embedded in the boot block code
US20050138389A1 (en)*2003-12-232005-06-23International Business Machines CorporationSystem and method for making password token portable in trusted platform module (TPM)
US20050138370A1 (en)*2003-12-232005-06-23Goud Gundrala D.Method and system to support a trusted set of operational environments using emulated trusted hardware
US20050141717A1 (en)*2003-12-302005-06-30International Business Machines CorporationApparatus, system, and method for sealing a data repository to a trusted computing platform
US20050166051A1 (en)*2004-01-262005-07-28Mark BuerSystem and method for certification of a secure platform
US20050216577A1 (en)*2004-03-242005-09-29Durham David MCooperative embedded agents
US20050221766A1 (en)*2004-03-312005-10-06Brizek John PMethod and apparatus to perform dynamic attestation
US20050235141A1 (en)*2004-04-192005-10-20Hewlett-Packard Development Company, L.P.Subordinate trusted platform module
US20050246521A1 (en)*2004-04-292005-11-03International Business Machines CorporationMethod and system for providing a trusted platform module in a hypervisor environment
US20050246525A1 (en)*2004-04-292005-11-03International Business Machines CorporationMethod and system for hierarchical platform boot measurements in a trusted computing environment
US20050246552A1 (en)*2004-04-292005-11-03International Business Machines CorporationMethod and system for virtualization of trusted platform modules
US20050257073A1 (en)*2004-04-292005-11-17International Business Machines CorporationMethod and system for bootstrapping a trusted server having redundant trusted platform modules
US6986042B2 (en)*2000-08-182006-01-10Hewlett-Packard Development Company, L.P.Computer system operable to revert to a trusted state
US20060010326A1 (en)*2004-07-082006-01-12International Business Machines CorporationMethod for extending the CRTM in a trusted platform
US20060015732A1 (en)*2004-07-152006-01-19Sony CorporationProcessing system using internal digital signatures
US20060015717A1 (en)*2004-07-152006-01-19Sony Corporation And Sony Electronics, Inc.Establishing a trusted platform in a digital processing system
US20060015718A1 (en)*2004-07-152006-01-19Sony CorporationUse of kernel authorization data to maintain security in a digital processing system
US20060026418A1 (en)*2004-07-292006-02-02International Business Machines CorporationMethod, apparatus, and product for providing a multi-tiered trust architecture
US20060026422A1 (en)*2004-07-292006-02-02International Business Machines CorporationMethod, apparatus, and product for providing a backup hardware trusted platform module in a hypervisor environment
US20060026419A1 (en)*2004-07-292006-02-02International Business Machines CorporationMethod, apparatus, and product for providing a scalable trusted platform module in a hypervisor environment
US7000829B1 (en)*2002-07-162006-02-21Diebold, IncorporatedAutomated banking machine key loading system and method
US7013384B2 (en)*2002-01-152006-03-14Lenovo (Singapore) Pte. Ltd.Computer system with selectively available immutable boot block code
US20060072762A1 (en)*2004-10-012006-04-06Mark BuerStateless hardware security module
US20060072748A1 (en)*2004-10-012006-04-06Mark BuerCMOS-based stateless hardware security module
US20060075223A1 (en)*2004-10-012006-04-06International Business Machines CorporationScalable paging of platform configuration registers
US7028149B2 (en)*2002-03-292006-04-11Intel CorporationSystem and method for resetting a platform configuration register
US20060085844A1 (en)*2004-10-202006-04-20Mark BuerUser authentication system
US20060085637A1 (en)*2004-10-152006-04-20Binyamin PinkasAuthentication system and method
US20060090084A1 (en)*2004-10-222006-04-27Mark BuerSecure processing environment
US20060100010A1 (en)*2002-07-052006-05-11Cyberscan Technology, Inc.Secure game download
US20060112267A1 (en)*2004-11-232006-05-25Zimmer Vincent JTrusted platform storage controller
US20060117177A1 (en)*2004-11-292006-06-01Buer Mark LProgrammable security platform
US20060129824A1 (en)*2004-12-152006-06-15Hoff James PSystems, methods, and media for accessing TPM keys
US20060136717A1 (en)*2004-12-202006-06-22Mark BuerSystem and method for authentication via a proximate device
US20060143431A1 (en)*2004-12-212006-06-29Intel CorporationMethod to provide autonomic boot recovery
US7121460B1 (en)*2002-07-162006-10-17Diebold Self-Service Systems Division Of Diebold, IncorporatedAutomated banking machine component authentication system and method
US7127579B2 (en)*2002-03-262006-10-24Intel CorporationHardened extended firmware interface framework
US7130951B1 (en)*2002-04-182006-10-31Advanced Micro Devices, Inc.Method for selectively disabling interrupts on a secure execution mode-capable processor
US7171539B2 (en)*2002-11-182007-01-30Arm LimitedApparatus and method for controlling access to a memory
US7207039B2 (en)*2003-12-242007-04-17Intel CorporationSecure booting and provisioning
US7236455B1 (en)*1999-02-152007-06-26Hewlett-Packard Development Company, L.P.Communications between modules of a computing apparatus

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication numberPriority datePublication dateAssigneeTitle
JPH0635718A (en)*1992-07-151994-02-10Matsushita Electric Works LtdSystem degradation system at the time of system abnormality
CN1153348A (en)*1995-12-251997-07-02合泰半导体股份有限公司Flag setting circuit for microprocessor
CN1107920C (en)*1998-11-272003-05-07中国科学院空间科学与应用研究中心General data acquisition unit and its operating method
US6874087B1 (en)*1999-07-132005-03-29International Business Machines CorporationIntegrity checking an executable module and associated protected service provider module
EP1076279A1 (en)*1999-08-132001-02-14Hewlett-Packard CompanyComputer platforms and their methods of operation
JP2001101033A (en)*1999-09-272001-04-13Hitachi Ltd Fault monitoring method for operating system and application program
JP2003208314A (en)*2002-01-152003-07-25Mitsubishi Electric Corp Computer system capable of automatically replacing an operating system and method for automatically replacing an operation system using the computer system
EP1429224A1 (en)2002-12-102004-06-16Texas Instruments IncorporatedFirmware run-time authentication
CN2599652Y (en)*2002-12-042004-01-14华为技术有限公司 A Watchdog Clearing Dog Circuit
TWI319147B (en)*2003-04-102010-01-01Lenovo Singapore Pte LtdApparatus, motherboard, method and computer-readable storage medium recording instructions capable of determinging physical presence in a trusted platform in a computer system

Patent Citations (63)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US4817094A (en)* | 1986-12-31 | 1989-03-28 | International Business Machines Corporation | Fault tolerant switch with selectable operating modes
US4855922A (en)* | 1987-03-20 | 1989-08-08 | Scientific-Atlanta, Inc. | Apparatus and method for monitoring an energy management system
US6871283B1 (en)* | 1990-02-13 | 2005-03-22 | Hewlett-Packard Development Company, L.P. | Processing trusted commands in trusted and untrusted environments
US5522040A (en)* | 1990-12-10 | 1996-05-28 | Robert Bosch GmbH | Arrangement for testing a watchdog circuit
US5563799A (en)* | 1994-11-10 | 1996-10-08 | United Technologies Automotive, Inc. | Low cost/low current watchdog circuit for microprocessor
US20020124212A1 (en)* | 1997-03-24 | 2002-09-05 | Werner Nitschke | Watchdog circuit
US6408170B1 (en)* | 1997-10-08 | 2002-06-18 | U.S. Philips Corporation | Control circuit for a microcontroller
US6385727B1 (en)* | 1998-09-25 | 2002-05-07 | Hughes Electronics Corporation | Apparatus for providing a secure processing environment
US7236455B1 (en)* | 1999-02-15 | 2007-06-26 | Hewlett-Packard Development Company, L.P. | Communications between modules of a computing apparatus
US6986042B2 (en)* | 2000-08-18 | 2006-01-10 | Hewlett-Packard Development Company, L.P. | Computer system operable to revert to a trusted state
US20020023212A1 (en)* | 2000-08-18 | 2002-02-21 | Hewlett-Packard Company | Performance of a service on a computing platform
US20020184482A1 (en)* | 2001-05-31 | 2002-12-05 | John Lacombe | Application-level software watchdog timer
US7000100B2 (en)* | 2001-05-31 | 2006-02-14 | Hewlett-Packard Development Company, L.P. | Application-level software watchdog timer
US20040255000A1 (en)* | 2001-10-03 | 2004-12-16 | Simionescu Dan C. | Remotely controlled failsafe boot mechanism and remote manager for a network device
US20030084337A1 (en)* | 2001-10-03 | 2003-05-01 | Simionescu Dan C. | Remotely controlled failsafe boot mechanism and manager for a network device
US20030084285A1 (en)* | 2001-10-26 | 2003-05-01 | International Business Machines Corporation | Method and system for detecting a tamper event in a trusted computing environment
US20030126519A1 (en)* | 2001-12-28 | 2003-07-03 | Kresimir Odorcic | Method and apparatus for controlling an electronic control
US7013384B2 (en)* | 2002-01-15 | 2006-03-14 | Lenovo (Singapore) Pte. Ltd. | Computer system with selectively available immutable boot block code
US7127579B2 (en)* | 2002-03-26 | 2006-10-24 | Intel Corporation | Hardened extended firmware interface framework
US20030188165A1 (en)* | 2002-03-29 | 2003-10-02 | Sutton James A. | System and method for execution of a secured environment initialization instruction
US7028149B2 (en)* | 2002-03-29 | 2006-04-11 | Intel Corporation | System and method for resetting a platform configuration register
US7069442B2 (en)* | 2002-03-29 | 2006-06-27 | Intel Corporation | System and method for execution of a secured environment initialization instruction
US20050182940A1 (en)* | 2002-03-29 | 2005-08-18 | Sutton James A. II | System and method for execution of a secured environment initialization instruction
US7130951B1 (en)* | 2002-04-18 | 2006-10-31 | Advanced Micro Devices, Inc. | Method for selectively disabling interrupts on a secure execution mode-capable processor
US20060100010A1 (en)* | 2002-07-05 | 2006-05-11 | Cyberscan Technology, Inc. | Secure game download
US7000829B1 (en)* | 2002-07-16 | 2006-02-21 | Diebold, Incorporated | Automated banking machine key loading system and method
US7121460B1 (en)* | 2002-07-16 | 2006-10-17 | Diebold Self-Service Systems Division Of Diebold, Incorporated | Automated banking machine component authentication system and method
US20040093508A1 (en)* | 2002-08-03 | 2004-05-13 | Dirk Foerstner | Method for monitoring a microprocessor and circuit arrangement having a microprocessor
US7171539B2 (en)* | 2002-11-18 | 2007-01-30 | Arm Limited | Apparatus and method for controlling access to a memory
US20040199769A1 (en)* | 2003-04-07 | 2004-10-07 | Proudler Graeme John | Provision of commands to computing apparatus
US20050028000A1 (en)* | 2003-07-28 | 2005-02-03 | Mallik Bulusu | Method and apparatus for trusted blade device computing
US20050039013A1 (en)* | 2003-08-11 | 2005-02-17 | Bajikar Sundeep M. | Method and system for authenticating a user of a computer system that has a trusted platform module (TPM)
US20050108564A1 (en)* | 2003-11-13 | 2005-05-19 | International Business Machines Corporation | Reducing the boot time of a TCPA based computing system when the Core Root of Trust Measurement is embedded in the boot block code
US20050138389A1 (en)* | 2003-12-23 | 2005-06-23 | International Business Machines Corporation | System and method for making password token portable in trusted platform module (TPM)
US20050138370A1 (en)* | 2003-12-23 | 2005-06-23 | Goud Gundrala D. | Method and system to support a trusted set of operational environments using emulated trusted hardware
US7207039B2 (en)* | 2003-12-24 | 2007-04-17 | Intel Corporation | Secure booting and provisioning
US20050141717A1 (en)* | 2003-12-30 | 2005-06-30 | International Business Machines Corporation | Apparatus, system, and method for sealing a data repository to a trusted computing platform
US20050166051A1 (en)* | 2004-01-26 | 2005-07-28 | Mark Buer | System and method for certification of a secure platform
US20050216577A1 (en)* | 2004-03-24 | 2005-09-29 | Durham David M | Cooperative embedded agents
US20050221766A1 (en)* | 2004-03-31 | 2005-10-06 | Brizek John P | Method and apparatus to perform dynamic attestation
US20050235141A1 (en)* | 2004-04-19 | 2005-10-20 | Hewlett-Packard Development Company, L.P. | Subordinate trusted platform module
US20050246552A1 (en)* | 2004-04-29 | 2005-11-03 | International Business Machines Corporation | Method and system for virtualization of trusted platform modules
US20050257073A1 (en)* | 2004-04-29 | 2005-11-17 | International Business Machines Corporation | Method and system for bootstrapping a trusted server having redundant trusted platform modules
US20050246521A1 (en)* | 2004-04-29 | 2005-11-03 | International Business Machines Corporation | Method and system for providing a trusted platform module in a hypervisor environment
US20050246525A1 (en)* | 2004-04-29 | 2005-11-03 | International Business Machines Corporation | Method and system for hierarchical platform boot measurements in a trusted computing environment
US20060010326A1 (en)* | 2004-07-08 | 2006-01-12 | International Business Machines Corporation | Method for extending the CRTM in a trusted platform
US20060015718A1 (en)* | 2004-07-15 | 2006-01-19 | Sony Corporation | Use of kernel authorization data to maintain security in a digital processing system
US20060015732A1 (en)* | 2004-07-15 | 2006-01-19 | Sony Corporation | Processing system using internal digital signatures
US20060015717A1 (en)* | 2004-07-15 | 2006-01-19 | Sony Corporation And Sony Electronics, Inc. | Establishing a trusted platform in a digital processing system
US20060026422A1 (en)* | 2004-07-29 | 2006-02-02 | International Business Machines Corporation | Method, apparatus, and product for providing a backup hardware trusted platform module in a hypervisor environment
US20060026419A1 (en)* | 2004-07-29 | 2006-02-02 | International Business Machines Corporation | Method, apparatus, and product for providing a scalable trusted platform module in a hypervisor environment
US20060026418A1 (en)* | 2004-07-29 | 2006-02-02 | International Business Machines Corporation | Method, apparatus, and product for providing a multi-tiered trust architecture
US20060075223A1 (en)* | 2004-10-01 | 2006-04-06 | International Business Machines Corporation | Scalable paging of platform configuration registers
US20060072748A1 (en)* | 2004-10-01 | 2006-04-06 | Mark Buer | CMOS-based stateless hardware security module
US20060072762A1 (en)* | 2004-10-01 | 2006-04-06 | Mark Buer | Stateless hardware security module
US20060085637A1 (en)* | 2004-10-15 | 2006-04-20 | Binyamin Pinkas | Authentication system and method
US20060085844A1 (en)* | 2004-10-20 | 2006-04-20 | Mark Buer | User authentication system
US20060090084A1 (en)* | 2004-10-22 | 2006-04-27 | Mark Buer | Secure processing environment
US20060112267A1 (en)* | 2004-11-23 | 2006-05-25 | Zimmer Vincent J | Trusted platform storage controller
US20060117177A1 (en)* | 2004-11-29 | 2006-06-01 | Buer Mark L | Programmable security platform
US20060129824A1 (en)* | 2004-12-15 | 2006-06-15 | Hoff James P | Systems, methods, and media for accessing TPM keys
US20060136717A1 (en)* | 2004-12-20 | 2006-06-22 | Mark Buer | System and method for authentication via a proximate device
US20060143431A1 (en)* | 2004-12-21 | 2006-06-29 | Intel Corporation | Method to provide autonomic boot recovery

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"TCG Specification Architecture Overview," publication dated Apr. 28, 2004, 54 pages.

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US8719171B2 (en) | 2003-02-25 | 2014-05-06 | Microsoft Corporation | Issuing a publisher use license off-line in a digital rights management (DRM) system
US8700535B2 (en) | 2003-02-25 | 2014-04-15 | Microsoft Corporation | Issuing a publisher use license off-line in a digital rights management (DRM) system
US8347078B2 (en) | 2004-10-18 | 2013-01-01 | Microsoft Corporation | Device certificate individualization
US9336359B2 (en) | 2004-10-18 | 2016-05-10 | Microsoft Technology Licensing, Llc | Device certificate individualization
US9224168B2 (en) | 2004-11-15 | 2015-12-29 | Microsoft Technology Licensing, Llc | Tuning product policy using observed evidence of customer behavior
US8176564B2 (en) | 2004-11-15 | 2012-05-08 | Microsoft Corporation | Special PC mode entered upon detection of undesired state
US8336085B2 (en) | 2004-11-15 | 2012-12-18 | Microsoft Corporation | Tuning product policy using observed evidence of customer behavior
US8464348B2 (en) | 2004-11-15 | 2013-06-11 | Microsoft Corporation | Isolated computing environment anchored into CPU and motherboard
US8725646B2 (en) | 2005-04-15 | 2014-05-13 | Microsoft Corporation | Output protection levels
US9436804B2 (en) | 2005-04-22 | 2016-09-06 | Microsoft Technology Licensing, Llc | Establishing a unique session key using a hardware functionality scan
US9363481B2 (en) | 2005-04-22 | 2016-06-07 | Microsoft Technology Licensing, Llc | Protected media pipeline
US9189605B2 (en) | 2005-04-22 | 2015-11-17 | Microsoft Technology Licensing, Llc | Protected computing environment
US8438645B2 (en) | 2005-04-27 | 2013-05-07 | Microsoft Corporation | Secure clock with grace periods
US8781969B2 (en) | 2005-05-20 | 2014-07-15 | Microsoft Corporation | Extensible media rights
US8353046B2 (en) | 2005-06-08 | 2013-01-08 | Microsoft Corporation | System and method for delivery of a modular operating system
US8433923B2 (en)* | 2006-03-22 | 2013-04-30 | Fujitsu Limited | Information processing device having activation verification function
US20070226518A1 (en)* | 2006-03-22 | 2007-09-27 | Fujitsu Limited | Information processing device having activation verification function
US20080104701A1 (en)* | 2006-05-22 | 2008-05-01 | Eric Peacock | System and method for secure operating system boot
US8122258B2 (en)* | 2006-05-22 | 2012-02-21 | Hewlett-Packard Development Company, L.P. | System and method for secure operating system boot
US20080320312A1 (en)* | 2007-06-21 | 2008-12-25 | Microsoft Corporation | Hardware-Based Computer Theft Deterrence
US8522043B2 (en)* | 2007-06-21 | 2013-08-27 | Microsoft Corporation | Hardware-based computer theft deterrence
US20100212021A1 (en)* | 2009-02-18 | 2010-08-19 | Harris Technology, Llc | Decrement software
US8151362B2 (en)* | 2009-03-16 | 2012-04-03 | Konica Minolta Business Technologies, Inc. | Image forming apparatus, function extending method and user authentication system
US20100235888A1 (en)* | 2009-03-16 | 2010-09-16 | Konica Minolta Business Technologies, Inc. | Image forming apparatus, function extending method and user authentication system
WO2011151211A1 (en) | 2010-06-03 | 2011-12-08 | Telefonaktiebolaget L M Ericsson (Publ) | Processing device
US9588776B2 (en) | 2010-06-03 | 2017-03-07 | Telefonaktiebolaget Lm Ericsson (Publ) | Processing device
EP2393007A1 (en) | 2010-06-03 | 2011-12-07 | Telefonaktiebolaget L M Ericsson AB (Publ) | Processing device
CN102063593A (en)* | 2011-01-07 | 2011-05-18 | 北京工业大学 | Credible device with active control function and authentication method thereof
CN102063593B (en)* | 2011-01-07 | 2013-01-09 | 北京工业大学 | Credible device with active control function and authentication method thereof
US8375221B1 (en) | 2011-07-29 | 2013-02-12 | Microsoft Corporation | Firmware-based trusted platform module for arm processor architectures and trustzone security extensions
US9489512B2 (en) | 2011-07-29 | 2016-11-08 | Microsoft Technology Licensing, Llc | Trustzone-based integrity measurements and verification using a software-based trusted platform module
US20150220927A1 (en)* | 2013-09-25 | 2015-08-06 | Ned M. Smith | Method, apparatus and system for providing transaction indemnification
US20150365231A1 (en)* | 2014-06-12 | 2015-12-17 | Nxp B.V. | Method for configuring a secure element, key derivation program, computer program product and configurable secure element
US10460106B2 (en) | 2015-02-06 | 2019-10-29 | Alibaba Group Holding Limited | Method and device for identifying computer virus variants
US11126717B2 (en) | 2015-02-06 | 2021-09-21 | Banma Zhixing Network (Hong Kong) Co., Limited | Techniques for identifying computer virus variant
US9612893B2 (en) | 2015-05-11 | 2017-04-04 | Silicon Laboratories Inc. | Peripheral watchdog timer
US11977635B2 (en) | 2020-05-27 | 2024-05-07 | Basler Aktiengesellschaft | Protection of computer systems against manipulation and functional anomalies

Also Published As

Publication number | Publication date
RU2007123617A (en) | 2008-12-27
KR101213807B1 (en) | 2012-12-18
BRPI0519080A2 (en) | 2008-12-23
JP4945454B2 (en) | 2012-06-06
CN101116070A (en) | 2008-01-30
WO2006071630A3 (en) | 2007-08-02
JP2008525892A (en) | 2008-07-17
US20060143446A1 (en) | 2006-06-29
EP1829274A4 (en) | 2012-01-18
EP1829274A2 (en) | 2007-09-05
CN101116070B (en) | 2010-06-09
WO2006071630A2 (en) | 2006-07-06
MX2007006143A (en) | 2007-07-19
KR20070097031A (en) | 2007-10-02

Similar Documents

Publication | Publication Date | Title
US7360253B2 (en)System and method to lock TPM always ‘on’ using a monitor
US9189605B2 (en)Protected computing environment
US7565553B2 (en)Systems and methods for controlling access to data on a computer with a secure boot process
JP4981051B2 (en) Change product behavior according to license
US7984283B2 (en)System and method for secure operating system boot
EP2854066B1 (en)System and method for firmware integrity verification using multiple keys and OTP memory
KR20070084257A (en) Isolated computing environment secured to CPUs and motherboards
US20050132217A1 (en)Secure and backward-compatible processor and secure software execution thereon
KR20070084259A (en) Systems and Methods for Programming Isolated Computing Environments
KR20070102489A (en) Final line of defense to ensure and enforce sufficiently valid / current code
US10936722B2 (en)Binding of TPM and root device
KR20070084258A (en) Special PC mode to enter when an unwanted condition is detected
US11347858B2 (en)System and method to inhibit firmware downgrade
WO2025139716A1 (en)Firmware execution method, device and system, storage medium, and electronic device
WO2006115533A2 (en)Protected computing environment
CN101189615B (en) Method for establishing and maintaining a protected computing environment
WO2024078159A1 (en)Integrity measurement method and apparatus
US20080184026A1 (en)Metered Personal Computer Lifecycle
US11921858B2 (en)System and method for protecting against alterations in chain of trust sequences
KR102369874B1 (en)A system for remote attestation, os deployment server, attestation target device and method for updating operating system and integrity information simultaneously
CN119760696A (en)Key management and control method and chip starting method

Legal Events

Date | Code | Title | Description
AS | Assignment

Owner name:MICROSOFT CORPORATION, WASHINGTON

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRANK, ALEXANDER;ENGLAND, PAUL;REEL/FRAME:015689/0667;SIGNING DATES FROM 20041222 TO 20050104

AS | Assignment

Owner name:MICROSOFT CORPORATION, WASHINGTON

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FRANK, ALEXANDER;ENGLAND, PAUL;REEL/FRAME:015973/0043;SIGNING DATES FROM 20050330 TO 20050501

STCF | Information on status: patent grant

Free format text:PATENTED CASE

CC | Certificate of correction
FPAY | Fee payment

Year of fee payment:4

AS | Assignment

Owner name:MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text:ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034543/0001

Effective date:20141014

FPAY | Fee payment

Year of fee payment:8

FEPP | Fee payment procedure

Free format text:MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS | Lapse for failure to pay maintenance fees

Free format text:PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH | Information on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP | Lapsed due to failure to pay maintenance fee

Effective date:20200415

