COMPUTER SOFTWARE AUTHENTICATION, PROTECTION, AND SECURITY SYSTEM
BACKGROUND OF THE INVENTION
The present invention relates to a computer program having enhanced security features, and also to a system and method for enhancing the security features of a computer program. In particular, the present invention relates to such a program, and to the system and method for creating the program, having increased security features to prevent ID-Data (as defined hereafter) eavesdropping and/or theft and/or to ensure authenticity.
DESCRIPTION OF THE PRIOR ART
Computers are becoming widely interconnected and heavily relied upon to process and store sensitive information. The risk of unauthorised access to computers and information has increased with this increased interconnectivity.
Many security advances exist in the areas of identification and authentication of users, cryptography, virus prevention, and the like; however, almost all of these advances ultimately rely upon computer software. Most computer systems are, or are accessed by, small personal computers, and most software used on these personal computers is susceptible to "local attacks" - attacks which are mounted from inside said personal computers against said software by other software or people.
Passwords, User-IDs, credit-card numbers and expiry dates, bank account and PIN numbers, smart-card data, biometric information (for example, the data comprising a retina or fingerprint scan), cryptographic keys, and the like are all examples of identification, authentication or similar data which is either sensitive in itself, or may allow access to sensitive, restricted or other information or services. Hereafter, the term ID-Data will be used to refer to the abovementioned identification, authentication or similar data, excluding data which is valid only for a single use, or which is designed to expire at regular intervals of less than two minutes.
Illegal access to computer system information can be obtained by exploiting various security flaws found in computer software products. A common flaw is the susceptibility of said software to the theft of ID-Data, either directly from said software as it executes, or from the operating system or hardware on which said software is executing. Another common flaw is the susceptibility of said software to illegal modification. Such modifications may remove, disable, or compromise the security features of said software.
Viruses, terminate-and-stay-resident programs (TSRs), co-resident software, multithreaded operating system processes, Trojan Horses, Worms, Hackers, Spoof programs, key-press password capturers, macro-recorders, sniffers, and the like can be effective at stealing ID-Data and are examples of (a) rogue software, or (b) people capable of subverting security software, or (c) software which can be configured for illegitimate purposes. Hereafter, the term rogue software will be used to refer to software or subversions such as the abovementioned (a), (b) and (c), used for the purpose of stealing ID-Data. The definition of our term "rogue software" when used herein also includes software or other means used to tamper with other software. The term tampering is defined hereafter. There are many ways to introduce rogue software into a computer system. Viruses spread by introducing themselves automatically. Trojan Horses are usually introduced by tricking users into allowing them to execute (such as by masquerading as a new or well-known computer game or other product). Existing security problems may be utilised to introduce rogue software; some well known problems include Java bugs, errors, or oversights; ineffective physical security (for example, permitting rogue software to be introduced directly on floppy disk by an intruder); electronic mail attachments which execute automatically or after a simple mouse-click; incorrect security settings on internet, world-wide-web, TCP/IP or modem connections; and tampering (see definition hereafter) with legitimate software in transit as it flows from remote internet sites into a user's computer, to name a few.
Rogue software, once introduced, can steal ID-Data as mentioned hereinbefore. It may monitor the keyboard (for example, by recording every key as the user presses it, in order to steal a password as it is being typed in), serial port, mouse, screen, or other devices to steal ID-Data directly from them. It may monitor other software, applications, the operating system, or disks to steal ID-Data from there also. Once stolen, this ID-Data may be stored locally (for example, in memory or on disk), transmitted to remote locations (for example, by modem or network), or used immediately to perform illegal operations. Hereafter, the term eavesdropping will be used to refer to the monitoring of a computer to record ID-Data.
For example, a key-press recorder could secretly, and unbeknown to the computer user, record all the keys pressed by the user into a hidden system file. The information recorded could include a user's password and other sensitive information which an organisation would obviously wish to protect.
Additionally, rogue software may remove, disable, or compromise existing computer software security features by modifying the memory, disk, or other image of said computer software. Rogue software may also utilise tampering techniques to alter existing computer software in order to steal ID-Data from it, or may attach itself to existing computer software (as is the case with many computer viruses). Hereafter, the term tampering will be used to refer to the abovementioned modification of computer software. Tampering may take place either locally (within a user's PC) or remotely (for example, at one of the points which a computer program passes through as it is being downloaded).
Further, counterfeit software can be substituted for legitimate software. The counterfeit will appear real to a computer user, but actually acts to subvert security, such as by stealing ID-Data. Sometimes called "Spoof" programs or Trojan Horses, counterfeit software of this type may invoke the original legitimate software after having stolen ID-Data, so as not to arouse a user's suspicion.
Another potential security flaw found in computer software products is susceptibility to examination and reverse-engineering. Known (but generally secret) and other security problems or mistakes can be discovered by hackers and the like from the examination of existing computer software and by tracing its operation.
Additionally, computer software piracy is a growing problem, and the existing simple means which address this problem (such as registration or serial numbers and customer names being encoded within the product) are becoming less effective.
There is a necessity within the try-before-you-buy software market for vendors to employ effective features which allow old software to expire, without fear of hackers or the like removing said expiry features, and for secure registration of software to be provided through the use of software unlock-codes.
There is also a need for software to be able to prevent security attacks upon itself (ie: tampering) and upon its own attack-detection code. There may also be a future need for software to identify the attacker for subsequent prosecution.
There also exist cases where untamperable software usage metering may be desirable, and where effective password-protection of software execution may also be desirable.
Known advances in certain areas of computer security have been successful and documented.
There have been some advances in anti-virus technology which help detect and prevent certain security problems. There have been numerous advances in hardware-assisted computer security add-ons and devices, such as smartcards and biometric input devices. There have been advances in cryptographic techniques. Generally, all of these advances require authentic, untampered-with computer software in order to work. There have been relatively few advances in software-based integrity self-checking (eg: tamper protection), and no prior software-based advances in preventing eavesdropping or the electronic theft of ID-Data, and no prior software-based advances in self-authentication.
SUMMARY OF THE INVENTION
This invention describes a process which substantially enhances the security of computer software (hereafter referred to as the improved process) and a method by which to apply said improved process (hereafter referred to as the applicator).
The improved process consists of including computer code to automatically detect tampering of said computer software, and computer code to prevent the theft of ID-Data by replacing existing software or operating system code which is vulnerable to rogue software eavesdropping or attack with secure equivalents which utilise anti-spy techniques (as described later in this document).
Preferably, the improved process also consists of including computer code to prevent de-compilation, reverse-engineering, and disassembly by the inclusion of obfuscating code inserts, and the use of executable encryption.
Preferably, the improved process also consists of including code to prevent execution-tracing and debugging by the use of code designed to detect and prevent these operations.
Preferably, the improved process consists of, or also includes, human-recognisable audio-visual components which permit the authenticity of said computer software to be easily verified by the user on each invocation using techniques described later in this document.
The idea which led to the creation of this invention can be summarised as follows: if a piece of computer software that is executing can be shown to be the genuine article, and this software can protect itself against eavesdropping, and this software can prevent tampering of itself, then it is possible for this software to function in a secure manner, even within an insecure operating system. This invention permits the creation of such a piece of computer software - having a tangible, useful security advantage and hence improving its value.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig.1 illustrates the standard operation of a computer system known in the prior art;
Fig.2 illustrates the known operation of a rogue or "spoof" program;
Fig.3 illustrates application code updated with the preferred embodiment;
Fig.4 illustrates the known operation of a rogue eavesdropping program;
Fig.5 illustrates the interaction of the components of the updated application;
Fig.6 illustrates the general structure of the preferred embodiment of the applicator;
Fig.7 illustrates a standard layout for a program to be executed on a computer system;
Fig.8 illustrates the standard layout of an EXE header under the MS-DOS operating system;
Fig.9 illustrates a standard layout of an EXE program under MS-DOS;
Fig.10 illustrates an altered executable form constructed in accordance with the specific embodiment;
Fig.11 illustrates a first stage of execution of the new.exe executable;
Fig.12 illustrates a second stage of execution of the new.exe executable file;
Fig.13 illustrates a third stage of execution of the new.exe executable file.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
As will be described hereinafter, the present invention has general applicability to many different operating systems including Microsoft DOS (Trade Mark), Apple Macintosh Operating System, Unix (Trade Mark), etc.
Described hereafter are several security-enhancing techniques to combat eavesdropping. Security is provided by (a) hampering examination of software code or operating system code, or parts thereof, through the use of encryption or partial encryption of said code; (b) preventing the disassembly of said code through the inclusion of dummy instructions and prefixes and additional code to mislead and hamper disassembly (ie: obfuscating inserts); (c) preventing the computerised tracing of the execution of said code (for example, with code debugging tools) through the use of instructions to detect, mislead, and hamper tracing; (d) preventing tampering of said code through the use of scanning to locate alterations, either or both on disk and in memory, either once at the start of execution or continuously upon certain events; or (e) preventing ID-Data theft through the inclusion of secure input/output routines (for example, routines to bypass the standard operating system keyboard calls and use custom-written higher-security routines as a replacement) to replace insecure computer-system routines. Hereafter, the term anti-spy will be used to refer to any combination of one or more of the abovementioned techniques [(a) through (e), or parts thereof] used to prevent eavesdropping.
Referring now to Fig.1, there is illustrated the standard scenario for "running" a given executable program 16, under the control of a computer operating system 17, on a computer 18. In the preferred embodiment of the present invention, the executable program 16 is subjected to modification, as will be described hereinafter, to ensure its integrity and improve its security.
There are five aspects of this invention's improved process, although said process is still substantially improved even if not all of them are present. These aspects are: (1) preventing eavesdropping; (2) preventing disassembly and examination; (3) detecting tampering; (4) preventing execution-tracing; and (5) ensuring authenticity.
The preferred embodiment of these aspects of the present invention will now be described.
Aspect 1. Preventing eavesdropping.
As hereinbefore described, it is desirable to prevent rogue software from eavesdropping on ID-Data. By replacing software which is vulnerable to eavesdropping with equivalent software which is far more secure, this purpose is achieved. To remove the vulnerability from said equivalent software, replacement routines may communicate directly with the hardware of the computer (for example, they may communicate with the keyboard circuitry instead of using the system-supplied, and hence possibly insecure, application interface keyboard-entry function calls) while disabling system interrupts which would permit rogue software to eavesdrop. Said replacement routines are coded to store retrieved ID-Data in a secure manner; ID-Data is not stored in full in plaintext (ie: unencrypted) in system or application buffers.
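A minimal sketch of this buffering discipline is given below (portable C, hypothetical names; the platform-specific step of reading the key directly from the keyboard hardware with interrupts disabled is represented here by an ordinary getchar() call and is assumed to be supplied by the replacement driver). Each character is folded into a running hash the moment it is read, so that no complete plaintext copy of the ID-Data ever resides in an application buffer.

    #include <stdio.h>
    #include <stdint.h>

    /* Fold one input character into a running FNV-1a hash so the ID-Data
     * is never held in full as plaintext.  In the real replacement driver
     * the character would come straight from the keyboard circuitry with
     * interrupts disabled, not from getchar().                           */
    static uint64_t fold_char(uint64_t digest, int ch)
    {
        digest ^= (uint64_t)(unsigned char)ch;
        digest *= 1099511628211ULL;          /* FNV-1a 64-bit prime */
        return digest;
    }

    /* Hypothetical secure-entry routine: returns only a digest of the
     * characters typed up to the terminating newline.                   */
    uint64_t secure_get_id_data(void)
    {
        uint64_t digest = 14695981039346656037ULL;  /* FNV offset basis */
        int ch;
        while ((ch = getchar()) != EOF && ch != '\n')
            digest = fold_char(digest, ch);         /* character discarded */
        return digest;                              /* compare digests, never plaintext */
    }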
Aspect 2. Preventing disassembly and examination.
As hereinbefore described, it is desirable to hamper disassembly (or de-compilation or reverse engineering) to protect software against eavesdropping and tampering, and to hinder examination of said software which might lead to secret security problems or mistakes being disclosed.
Obfuscating inserts can successfully prevent automatic disassembly. Obfuscation is achieved by following unconditional jump instructions (for example, an Intel JMP or CLC/JNC combination, or a CALL without a return expected, or any flow-of-control altering instruction which is known not to return to the usual place) with one or more dummy op-code bytes which will cause subsequent op-codes to be erroneously disassembled (for example, the Intel 0xEA prefix will cause disassembly of the subsequent 4 op-codes to be incorrect, displaying them as the offset to the JMP instruction indicated by the 0xEA prefix instead of the instructions they actually represent).
Dummy instructions may also be included to hamper disassembly by deliberately misleading a disassembler into believing a particular flow of control will occur, when in fact it will not. Flow of control can be designed to occur based upon CPU flag values determined from instructions executed a long time ago. Together with tracing prevention, this makes manual disassembly nearly impossible.
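By way of illustration only, the following fragment (GNU C with inline x86 assembly; a sketch rather than a definitive implementation) emits one such junk byte after an unconditional jump. Execution skips over the 0xEA byte, but a linear-sweep disassembler decodes it as the start of a far-jump instruction and therefore misinterprets the bytes that follow.

    #include <stdio.h>

    /* Obfuscating insert (GNU C, x86): a junk 0xEA byte placed after an
     * unconditional jump.  The CPU never executes the junk byte because
     * the jmp skips it, but a linear-sweep disassembler decodes 0xEA as
     * a far-jmp opcode and mis-reads the bytes that follow it.          */
    static void obfuscated_stub(void)
    {
        __asm__ volatile (
            "jmp 1f       \n\t"    /* real control flow skips the junk  */
            ".byte 0xEA   \n\t"    /* dummy op-code byte                */
            "1:           \n\t"    /* execution resumes here            */
            ::: "memory");
    }

    int main(void)
    {
        obfuscated_stub();
        puts("still running - the junk byte was never executed");
        return 0;
    }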
The majority of the executable portions of the software can be encrypted for external storage, the decryption taking place in-memory after the software is loaded from external sources, under the control of a decryption "header" which prevents its own tampering and disassembly etc. This makes manual and automatic disassembly nearly impossible, since the decryption should be designed to fail if tampering or tracing is detected.
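A minimal, hypothetical sketch of such a decryption header follows (portable C, using a simple XOR keystream purely for brevity; a real header would use a strong cipher such as DES as discussed later, and would transfer control to the decrypted code, a step that is platform-specific and omitted here). The point illustrated is that decryption and integrity verification are fused: if the stored image has been altered, the check fails and the intended code never comes into existence in memory.

    #include <stdint.h>
    #include <stddef.h>

    /* Decrypt the stored code image in place and verify it.  The cipher
     * here is a toy XOR keystream used only to keep the sketch short;
     * the structure (decrypt, then refuse to continue unless the check
     * value matches) is what matters.                                   */
    static int decrypt_and_verify(uint8_t *image, size_t len,
                                  uint32_t key, uint32_t expected_sum)
    {
        uint32_t sum = 0;
        for (size_t i = 0; i < len; i++) {
            key = key * 1103515245u + 12345u;      /* keystream step     */
            image[i] ^= (uint8_t)(key >> 16);      /* in-memory decrypt  */
            sum += image[i];
        }
        /* If the external copy was tampered with, the decrypted bytes are
         * garbage and the checksum will not match: execution must stop. */
        return sum == expected_sum;
    }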
Aspect 3. Detecting tampering.
As hereinbefore described, it is desirable to detect tampering, since this may lead to the reduction of software security.
This can be achieved with the use of code which is protected from disassembly and examination through obfuscation and encryption, and which re-reads its own external image and compares it with its known memory image or precalculated check-data to detect hot-patching (ie: the modification of software some time after it has been loaded from disk, but (usually) before execution of the modified section has commenced).
Additionally, the software can scan the memory image of itself one or more times, or continuously, to ensure that unexpected alterations do not occur.
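The on-disk half of this check can be sketched as follows (portable C; the path of the program's own external image and the precalculated check value are assumptions supplied at protection time by the applicator described later). The routine simply re-reads the external image and compares a checksum of it against the value embedded when the program was protected.

    #include <stdio.h>
    #include <stdint.h>

    /* Re-read this program's own external image and compare a simple
     * additive checksum against the value computed by the applicator
     * when the executable was protected.  Returns 1 if unaltered.      */
    static int external_image_intact(const char *own_path,
                                     uint32_t expected_sum)
    {
        FILE *f = fopen(own_path, "rb");
        if (f == NULL)
            return 0;                  /* missing image: treat as tampered */

        uint32_t sum = 0;
        int ch;
        while ((ch = fgetc(f)) != EOF)
            sum += (uint32_t)ch;       /* precalculated check-data style   */
        fclose(f);

        return sum == expected_sum;
    }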
Certain modifications to the external copy of software are reflected in subtle changes to the environment in which the modified software will be executed (for example, the size of the code, if altered, will be reflected in the initial code-size value supplied to the executing program being incorrect). Additionally, certain modifications to the operating system and environment of said software can also be monitored (for example, certain interrupt vector table pointers in Intel-processor applications) to detect unexpected changes by rogue software. These changes can also be detected to prevent tampering.
Once tampering is detected, program flow-of-control needs to be changed so that the potential compromise associated with ID-Data theft is avoided. This may involve the security-enhanced program terminating with a message indicating that its integrity has been compromised before all of the ID-Data is entered. Alternatively, the fact that tampering has been detected may be kept secret and the ID-Data retrieved; however, immediately upon retrieval, the ID-Data entered can be invalidated, thus preventing access to that which the now potentially compromised ID-Data would have otherwise allowed. This latter method allows for the possibility of security-enhanced software informing remote or other authorities that tampering was detected, and possibly other information, such as what specifically was altered and by whom. Care must be taken to ensure the integrity of the "remote-informing" code before ID-Data entry is permitted.
It will be apparent to one skilled in the art of low-level software programming that the five aspects described herein may be combined to provide substantially stronger security than any aspect taken on its own. For instance, to combine tamper-detection with encryption, the precalculated check-data derived during tamper-detection as described hereinbefore may actually form one part of the decryption key which is required to successfully decrypt the remaining executable software. If prevention-of-tracing and environment characteristics (including debugger detection as described hereafter) are additional portions of said decryption key, determination of said decryption key by any person or computer program other than the secure original becomes an extremely difficult, if not impossible, task.
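Continuing the toy example above, the combination can be sketched like this (portable C, illustrative names only): the key handed to the decryption routine is not stored anywhere; it is derived at run time from a checksum of the code as it actually exists, so a single patched byte silently yields the wrong key and the protected code never decrypts.

    #include <stdint.h>
    #include <stddef.h>

    /* Derive the decryption key from the observed state of the program
     * rather than storing it: a checksum of the (still encrypted) code
     * and a flag describing the execution environment both feed the key.
     * Tampering or tracing therefore produces a wrong key, and the
     * decryption step elsewhere simply yields garbage.                  */
    static uint32_t derive_key(const uint8_t *code, size_t len,
                               uint32_t environment_ok /* 1 if no debugger seen */)
    {
        uint32_t sum = 0;
        for (size_t i = 0; i < len; i++)
            sum = (sum << 1 | sum >> 31) ^ code[i];   /* rotating checksum */
        return sum ^ (environment_ok ? 0x5A5A5A5Au : 0u);
    }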
Further, it will also be apparent to one skilled in the art of low-level software programming that a simple construct such as a JNE to alter program flow-of-control after tampering has been detected is insufficient, since the JNE construct itself is subject to tampering. The decryption process described hereinbefore is preferable, since there is no single point of alteration that could possibly yield a tampered executable that would execute. Indeed, the executable protected with encryption will not even be transformed into its intended form if tampering is detected.
Aspect 4. Preventing execution-tracing.
Apart from "spoofing" (described in aspect 5 hereafter), the last resort of a rogue who is prevented from disassembly, tampering, and eavesdropping on software is to trace the execution of said software in order to facilitate the compromise of its security. Hampering tracing (tracing is sometimes called debugging) prevents this.
There are numerous methods of detecting a debug environment (ie: when tracing is taking place). When combined with decryption and tamper-protection as hereinbefore described, this makes the rogue's task of detecting and bypassing debug-detection extremely difficult. References and examples for Intel and MS-DOS environments follow hereafter, although it will be apparent to one skilled in the art that these and similar methods are applicable on other platforms.
Standard Intel x86 interrupts 1 and 3 are used by debuggers to facilitate code tracing. By utilising these interrupts (which are not normally used by normal applications) in security-enhanced software, debugging is hampered, since built-in debugging functions are then not automatically available.
Monitoring the system timer, to determine if software execution has spent too long accomplishing certain tasks, can detect a situation where code tracing has been in effect and a breakpoint was reached.
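A sketch of such a timing check in portable C follows (the threshold and the work performed are illustrative assumptions; under MS-DOS the same idea would read the BIOS tick counter instead of clock()). If a stretch of code that should complete almost instantly takes far longer, the program assumes it is being single-stepped or has sat at a breakpoint.

    #include <time.h>

    /* Time a short stretch of work that should complete almost instantly.
     * If it takes longer than max_seconds, assume the code is being
     * single-stepped or was halted at a breakpoint.  Returns 1 if a
     * debug environment is suspected.                                   */
    static int tracing_suspected(double max_seconds)
    {
        clock_t start = clock();

        volatile unsigned long sink = 0;
        for (unsigned long i = 0; i < 100000UL; i++)   /* trivial workload */
            sink += i;

        double elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;
        return elapsed > max_seconds;
    }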
Disabling the keyboard will hamper debuggers, since tracing instructions are usually issued from the keyboard. Similarly, disabling other places from which tracing instructions are usually issued (eg: serial ports, printer ports, and mouse) or displayed (eg: screen) will also hamper tracing.
System interrupts can be re-vectored for use within the secure software to perform tasks not usually performed by those interrupts. Debuggers usually rely upon system interrupts also, so doing this would usually disable or destroy a debugger being used to trace the software. Disabling interrupts and performing timing-sensitive instructions between them will further hamper debugging. When tracing software, instructions are usually executed one at a time in order for the user to understand their operation. Many system interrupts must occur regularly (eg: timer and memory-refresh operations), so debuggers usually do not disable interrupts even when they encounter an interrupt-disabling instruction. If timers and the like are re-vectored in two separate stages, any timer (etc) interrupt occurring in between the two stages will fail, and usually crash the computer.
Further, interrupts can be disabled or enabled using obscure means (with flag-altering instructions, for example) to hamper tracing.
Discreetly testing the status of disabled or enabled system facilities (eg: interrupts, keyboard, vector pointers), to ensure that a debug environment has not altered or bypassed them, will seriously hamper tracing also.
Certain computer processors have instruction caches. In some circumstances, it is possible to alter the instructions immediately before the CPU encounters them, but the altered instruction will not be executed normally because the cache still holds the "old" copy. In debug environments, the cache is usually flushed, so any altered instructions will actually be executed. This again hampers tracing.
Using strong cryptographic schemes, such as DES or RSA or the like, will prevent the examination of any decryption routines from revealing a simple patch to disable said routines.
When tracing software, the program stack is usually used by the debugger, either during the tracing operations or at other times. This is easily detected, and by using the area of the stack which would be destroyed by unexpected stack use for code or critical data, software can be designed to self-destruct in this situation.
Scanning the command environment and the execution instruction can detect the execution of software by unusual means. Searching for "DEBUG" in the command line, or scanning memory for known debuggers, for example, will detect tracing. Additionally, by detecting which operating system process initiated the load of the software, unexpected processes (eg: debuggers) can be detected.
Monitoring system buffers (eg: the keyboard memory buffer) or hardware (eg: the keyboard circuitry and internal buffers) for unexpected use (eg: keyboard input and processing occurring when the software is not requesting it) will also detect debuggers, which usually rely in part on system functions in order to operate.
Building a process or multiple processes which are traditionally difficult to trace, such as a resident or child process which executes during system interrupts or after the parent process has terminated, will again hamper tracing.
Bypassing system routines (eg: in DOS, using direct memory writes instead of DOS system calls to revector interrupts) will further hamper debugging and rogue software monitoring, as will unravelling loop constructs (which will make tracing long and cumbersome).
Code checksums and operating-system checks (eg: interrupt table pointers) can be designed to detect debug-breakpoint instruction inserts or other modifications. Using the result of the checksum for some obscure purpose (eg: decryption, or (much later) control-flow changes) will further hamper tracing.
It will be apparent to one skilled in the art of low-level software programming that a combination of techniques to detect, prevent, and mislead tracing will provide a mechanism making tracing very difficult, if not impossible. At the very least, it will require an expert with very expensive tools, and perhaps some understanding of the original software design, a very long time to make any debugging progress - a situation which is recognised in military software security accreditation worldwide as highly desirable.
Aspect 5. Ensuring authenticity.
In accordance with an aspect of the present invention, there is provided a method of providing for a secure entry of ID-Data in a computer system, comprising activating a visual display or animation and/or audio feedback (hereinafter called an audio/visual component) as part of said secure entry of ID-Data so as to hamper emulation of said secure entry process.
Preferably, the animation includes feedback portions as part of the ID-Data entry process.
Preferably, the animation is repeatable and varied in accordance with the information entered.
The animation preferably comprises 2.5D or 3D animation and includes animation of any ID-Data input.
Preferably, the animation is designed to tax the computer resources utilised, thereby making any forgery thereof more difficult.
Notwithstanding any other forms which may fall within the scope of the present invention, preferred forms of the invention will now be described, by way of example only, with reference to the accompanying drawings.
In the preferred embodiment of the present invention, the user interface for the acquiring of ID-Data is secured whereby the duplication of the interface is rendered mathematically complex, such that cipher-code breaking techniques are required to produce a counterfeit look-alike interface. By making the authentication interface (ie: the ID-Data entry screen - for example, a logon screen or a screen for entering credit card details) unable to be emulated, tampered with, or reverse engineered, the application program allows for a higher degree of security and authenticity even in insecure environments such as the Internet or home software applications.
Referring now to Fig.2, there is illustrated a classic form of rogue attack on a computer system. In this form of rogue attack, a rogue's "spoof" program 22 is inserted between application software 16 and the user 23. The application 16 normally has a portion 24 devoted to ID-Data entry and verification, or the entry of commercially sensitive information (including passwords etc) into the application, in addition to the application code 25. The spoof program 22 is designed to exactly reflect the presented user interface of ID-Data entry code 24 to the user. The user 23 is then fooled into utilising the masquerading spoof program 22 as if it were the application 16. Hence the user can be tricked into divulging secret information to the spoof program 22. An example may include a classic "login spoof", wherein the spoof program 22 prints the login prompt (ie: ID-Data entry) message on the screen and the user mistakes the login prompt for a legitimate one, supplying a user name and password to this program 22, which records this information as well as passing it on to the login code 24 of application 16 so as not to arouse the suspicion of user 23 - or by issuing a message, such as "incorrect password, please try again", and then passing control to the login code 24 of application 16.
Referring now to Fig.4, there is illustrated a relatively new form of rogue attack 40. This form of attack proceeds similarly to the spoof attack of Fig.2, with the following difference. Instead of a spoof program 22, a rogue program 41 is inserted which secretly eavesdrops on ID-Data entry code 24, or on application code 25, or on operating system 17, or on hardware 18, or elsewhere, in order to steal sensitive information directly from the legitimate application. Since the legitimate application is still actually executing, the user's suspicion is not aroused, as rogue program 41 is generally invisible to the user 23. Alternatively, executable program 16 may have been tampered with (as hereinbefore described) to reduce its security, alleviating the necessity for the presence of rogue program 41.
In Fig.5, there is illustrated in detail the structure of an application 50 constructed in accordance with the preferred embodiment, running on computer hardware 18. Fig.5 is similar to Fig.4, with the important difference that user 23 now communicates directly with secure drivers 51, which are part of the secure ID-Data entry program code 31 utilised by the security-enhanced (eg: tamper protected) application code 52. It can be seen that the user 23 no longer communicates with the operating system 17 or the unprotected computer hardware 18; thus the rogue program 41 can no longer eavesdrop on ID-Data.
In Fig.3, there is illustrated, in more general terms than Fig.5, the structure of an application 30 constructed in accordance with the preferred embodiment, wherein secure ID-Data entry program code 31 is provided which is extremely difficult to replicate, eavesdrop upon or subvert. The secured ID-Data entry program code 31 can be created utilising a number of different techniques.
Firstly, the executable portion of the secured ID-Data entry code can be protected against tracing, disassembly, tampering, viewing, reverse engineering, keyboard entry theft, eavesdropping, hot-patching and other attacks by transforming the secured ID-Data entry program code 31 from its normal executable form 16 (Fig.2) to a corresponding secured form of executable (as hereinbefore described - refer to aspects 1 to 4). These techniques are preferably applied to the application code 16 in general or, less preferably, specifically limited to the ID-Data entry portions 24 thereof.
Additionally, the secure ID-Data entry program code 31 is itself created. This code 31 preferably comprises a complex graphical user interface series of screens and animations designed to make duplication thereof by a rogue extremely difficult.
Initially, the complex user interface should include facilities to disable any frame-buffer recording devices, the disablement occurring before each frame is displayed. Also, where a multi-tasking operating system is in use, or where context switching is enabled, switching out of the interface screen is preferably disabled, or the ID-Data entry procedures are encrypted or terminated when the interface screen is swapped out. The images presented which form part of the ID-Data entry screens comprise complex 3D animation sequences having a high degree of complexity and extensive use of screen colours and screen resolution, in addition to visual design, so as to make copying thereof extremely difficult.
The complex computer graphics can be created utilising standard techniques. For information on how to create complex 3D imagery, reference is made to "Computer Graphics, Principles and Practice" by Foley, van Dam et al, published 1990 by Addison-Wesley Publishing Company, or other standard textbooks on the generation of computer graphics. Reference is also made to the numerous internet news groups and archives on graphics and games programming, specifically to: comp.graphics.research, comp.graphics.rendering, comp.graphics.raytracing, comp.graphics.misc, comp.graphics.digest, comp.graphics.animation, comp.graphics.algorithms, comp.graphics, alt.graphics.pixutils, alt.graphics, rec.games.programmer, comp.sys.programmer, comp.sys.ibm.programmer, comp.sys.ibm.pc.programmer, comp.os.msdos.programmer, comp.msdos.programmer, alt.msdos.programmer. Reference is also made to the "PC Games Programmers Frequently Asked Questions" document available on the internet, via rec.games.programmer and elsewhere.
By encoding a complex 3D image which forms part of the ID-Data entry screens, the hurdle faced by a rogue seeking to reverse engineer the complex imagery is substantially increased. The inclusion of graphical animation is advantageous in preventing static screen-shot duplication attacks by a rogue from succeeding.
As noted above, it is preferable that traditionally difficult graphical programming techniques are employed wherever possible, with the aim of making it easier for a user interacting with the system to discern lesser copies of the animation. Suitable 3D animation can include the introduction of shadows, the lighting of pseudo-3D animated objects, transparent or translucent objects, shiny, reflective, or mirrored objects, gravitational effects in animated objects, single-image-random-dot-stereogram bitmaps or backdrops, translucent threads, effects such as diffraction patterns, screen masks, backdrops, colour palette "animation", and complex animated objects resistant to the simple hidden-surface removal techniques known to those skilled in the art, all directed to hindering duplication.
Further, the animation can take into account:
1. Thwarting attempts at compression of the ID-Data entry screens. This can be achieved by having animation which has low visual redundancy (ie: high entropy) and many graphical elements which are altered from frame to frame in a manner which is highly discernible to the human viewer. Apart from being difficult to replicate, complex 3D computer imagery having low redundancy will require large amounts of storage space for a rogue attempt at duplication based on recording the screen output, and will therefore be more readily discernible to the user should this form of attack be mounted.
2. The animation is further preferably designed to thwart a successful replay attack which is based on providing only a subset (a limited number of frames) of the screen animation to a viewer. This can be achieved, for example, by the inclusion of several animated spheres which "bounce" around the screen and change colours in a manner that is recognisable to the viewing user but which is not readily repeatable. A replay of only a subset of the screen animations will be highly evident to the viewer in this case when, upon looping, the user is alerted to a problem because the animation "skips" or "jumps" and does not operate in its previously smooth manner. This makes it difficult for a rogue spoof program to copy the animation without including all parts of it.
3. Most importantly, the graphics presented can be customised to the input data entered. For example, the information entered by a user can be rendered and/or animated by the secure ID-Data entry program code 31 (Fig.3). As an example, in an ID-Data entry program, when a user types in their user name, the animation can be created letter by letter. For example, when typing in the user name "CHRIS", each letter could be rendered differently depending on those characters previously typed. For example, the letter "I" might appear as a large "barber's pole" which spirals and changes colour, speed, size, and/or position and is slightly transparent, thereby allowing the animated scene which is a backdrop to the character to be discerned through the character itself. In the above example, the letter "I" would only appear as the specific animated barber's pole that it does if the previous letters entered were "C", "H", and "R" respectively.
The utilisation of a unique sequence of animation based on a user's input of sensitive information increases the difficulty of creating any "spoof program" attack on the application 30. This is especially the case since the executable code of application 30 is preferably in an encrypted form. The use of animation particular to the order in which characters are entered is particularly advantageous, as the computational complexity of replication is substantially increased.
A similarly effective animation technique is to produce only one graphical object after entry of each portion of ID-Data, such as a computer-generated human face, but have the features of said face be determined by a hash or cryptographic function based upon the user's input. For example, after entry of the ID-Data "CHRIS" (in this example, the individual characters need not, themselves, be based on the abovementioned generation procedure), a teenage girl's face with long blonde hair and blue eyes may be displayed. If the "S" was instead a "D", the face would be entirely different. The ID-Data used for producing an object for display should not be ID-Data which is designed not to appear on-screen when entered (eg: a password), since the display of a corresponding object would give a rogue information on which to base guesses of the secret ID-Data.
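The mechanism can be sketched as follows (portable C, hypothetical parameter names): every character entered so far is folded into a rolling hash, and the visual parameters of the feedback object (here, a handful of illustrative "face feature" indices) are extracted from that hash. Because the parameters depend on the whole prefix typed so far, a spoof program cannot reproduce the correct feedback without reproducing the hash function itself.

    #include <stdint.h>
    #include <string.h>

    /* Illustrative feedback parameters derived from the ID-Data typed so
     * far; in a real interface these would drive the rendering of the
     * animated object (face shape, hair colour, motion speed, ...).     */
    struct feedback_params {
        unsigned face_shape;   /* 0..7  */
        unsigned hair_colour;  /* 0..15 */
        unsigned eye_colour;   /* 0..7  */
        unsigned motion_speed; /* 0..31 */
    };

    /* Fold the prefix typed so far into a rolling hash and unpack the
     * animation parameters from it.  Changing any earlier character
     * changes every parameter, which a human user quickly learns to
     * recognise and a spoof program cannot easily imitate.              */
    static struct feedback_params derive_feedback(const char *prefix)
    {
        uint32_t h = 2166136261u;                 /* FNV-1a 32-bit basis */
        for (size_t i = 0; i < strlen(prefix); i++) {
            h ^= (unsigned char)prefix[i];
            h *= 16777619u;
        }
        struct feedback_params p = {
            .face_shape   =  h        & 0x7u,
            .hair_colour  = (h >> 3)  & 0xFu,
            .eye_colour   = (h >> 7)  & 0x7u,
            .motion_speed = (h >> 10) & 0x1Fu,
        };
        return p;
    }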
By utilising cryptography, or having complex formulas to determine the sequencing of animation, a rogue programming the corresponding spoof program shall have to crack the cryptographic scheme in order to get the selection of character animation correct for any generalised attack. In the abovementioned example, a rogue will have to determine the algorithm for producing the face, since human beings are adept at recognising faces and will immediately notice if the face displayed on the screen is incorrect. Such a technique allows for a mathematically secure, visual method to guarantee the authenticity of the software which generates the screen feedback. The user of the software is instructed to note their own particular animation sequence and to immediately discontinue utilisation of the application 30 should that sequence ever change. The user may also be instructed to contact a trusted person, such as the supplier or operator of the application, to confirm that the animation sequence they witness is the authentic sequence intended by said supplier.
Further, the particular animation presented for a particular application 30 can be further customised for each application so as to be distinct (such as by the incorporation of the application's name as part of the animated image).
Further hindrance for a rogue programmer can be created by hand-coding portions of the animation in assembly language so as to generate the maximum possible complexity and interaction in the animation, with the highest level of detail, for individual workstation computers. This further raises the hurdle, allowing for the easier detection of rogue spoof programs 22, which will often be written in a more convenient, higher-level language (such as C or C++) and which will also operate at a different speed, the user being instructed to look for speed differences.
Further, animated scene timing can be utilised, provided that anti-looping and frame-removal detection is still catered for. The animated scene timing allows a user to detect unexpected irregularities in a frequently presented animated interface. By including in the animation some deliberate regularity (such as the rhythmic convergence of some parts of the animation in one particular spot), a rogue programming a spoof program shall also have to duplicate the preferably complex timing events necessary to accomplish this convergence. The regular nature of the scene timing should be pronounced enough that the user expects to see certain events, thereby making it difficult for a rogue spoof program to copy the animation without including all parts of it.
Preferably, where possible, all ID-Data is immediately encrypted, which makes recovery of the ID-Data by a rogue through analysis of the computer program memory difficult. Preferably, public-key cryptographic methods (eg: Elliptic-curve, RSA or Diffie-Hellman cryptography) should be used, making it impossible to reverse engineer the cryptographic code to decrypt any sensitive information should it be stolen in its encrypted form. Prohibiting all or most interrupts when data is to be entered, and encrypting or hashing the sensitive information immediately so that it is only stored partially, or in an encrypted form, before re-enabling interrupts, is one example of achieving this objective.
As a further alternative, analysis of a user's personal characteristics can be included as part of the interface. This can include attempts at recognition of a user's typing style (duration of key presses, delays between subsequent keys, choice of redundant keys, mouse usage characteristics, etc) or additional authentication techniques, including smartcards and biometric inputs such as fingerprint detectors.
Further, the graphical animation routines can be "watermarked" by the secure ID-Data entry program code, in that "hidden" information may be incorporated into the scene (for example, "salted checksums") to allow careful analysis of the output of secure ID-Data entry program code 31 to distinguish between original graphics animation and counterfeit animation. For example, the hidden information may be encoded in the least-significant bit of pixel data at selected locations of the animation.
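A minimal sketch of such least-significant-bit watermarking follows (portable C; the choice of locations and the salted check value are illustrative assumptions). Each watermark bit replaces the low bit of one selected pixel component, which is visually imperceptible but recoverable by anyone who knows which pixels to inspect.

    #include <stdint.h>
    #include <stddef.h>

    /* Embed the bits of a salted check value into the least-significant
     * bits of selected pixels of an 8-bit-per-component frame buffer.
     * The visual change is imperceptible, but a verifier that knows the
     * locations can recover the value and spot counterfeit frames.      */
    static void embed_watermark(uint8_t *pixels, size_t pixel_count,
                                const size_t *locations, size_t n_bits,
                                uint32_t salted_checksum)
    {
        for (size_t i = 0; i < n_bits && i < 32; i++) {
            size_t at = locations[i] % pixel_count;     /* stay in range */
            uint8_t bit = (uint8_t)((salted_checksum >> i) & 1u);
            pixels[at] = (uint8_t)((pixels[at] & 0xFEu) | bit);
        }
    }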
The user-determined sequence of animation can also extend to the audio feedback provided.
For example, audio and other feedback techniques, including music and speaking tones, can be played in response to particular keystroke combinations. By utilising different voices and/or tones and/or volumes and pitches for each keystroke or combination, the security of the application 30 can, once again, be substantially increased. The change in voice intonation will be readily "learnt" by a user, thereby further inhibiting a rogue's ability to duplicate the same sequence of sounds or voices. Of course, the encoding of the voice system should be in an encrypted form.
Further, upon detecting any attempt to subvert the secure ID-Data entry program code 31 (eg: subsequent to detecting tampering), a notification message is preferably sent to a prosecuting body or the like where the application 30 is currently, or later becomes, connected to a network such as the Internet, or by other means (eg: via modem or by including coded information in public or other files).
For application programs 30 requiring activation by a host program executed on a different computer, a secure means of activation can be incorporated into the client application 30. The host and client intercommunication can use challenge-and-response code authentication and verification utilising cryptographic systems such as public-key encryption and/or other standard means of overcoming data replay attacks and other threats designed to trick the secure client application 30 into activation.
It would be appreciated by a person skilled in the art that coding any data entry process utilising these techniques, together with additional techniques to protect against recording and eavesdropping, and executable protection techniques, may be necessary to improve the security of the interface. Additionally, executable encryption, additional authentication, and other methods are desirable in producing the protected executable.
It would be appreciated by a person skilled in the art that numerous combinations, variations and/or modifications may be made to the present invention as described without departing from the spirit or scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects to be illustrative and not restrictive.
Summary of the Applicator (of an improved process of security as hereinbefore described)
The preferred embodiment of the present invention's method (hereinbefore described as the "applicator") by which to apply an improved process of security (as hereinbefore described) will now be described with reference to the accompanying drawings.
Referring now to Fig.7, there is shown a standard format utilised for storing executables on disk, often occurring in the art, and in particular in conjunction with programs run on the abovementioned operating systems. The standard executable 16 normally comprises a header section 71, a code section 72, and a data section 73. The header section 71 normally stores a standard set of information required by the computer operating system 17 (Fig.1) for running of the executable 16. This can include relocation data, code size, etc. The code section 72 is normally provided for storing the "algorithmic" portion of the code. The data section 73 is normally utilised to store the data, such as constants, or overlays 92, utilised by the code section 72.
Turning now to Fig.6, the preferred embodiment of an applicator program 60 is shown which takes as its input the executable program 16 and performs an obfuscating step 61, a ciphering step 62, and an anti-key-press and authentication step 63 (described hereafter), which perform various transformations on the executable program 16 to produce a new executable program 30. The obfuscating step 61 modifies the header 71 (Fig.7) of the executable 16 in addition to inserting loading code, which will be described hereinafter. The cipher step 62 encrypts the existing executable 16 and calculates check data (eg: a checksum) for the encrypted executable. The anti-key-press and authentication step 63 replaces various insecure system calls with safe equivalent code and preferably inserts code to graphically represent the integrity of said executable program.
The newly formed executable 30 (new.exe) can then be stored on disk and the applicator program 60 completed, the new executable 30 replacing the old executable program 16.
When it is desired to run the replacement executable program 30, the replacement executable 30 (new.exe) executes the obfuscating code previously inserted by applicator 60. The obfuscating code initially decrypts the executable program and validates the stored check-data before re-executing the decrypted executable.
The foregoing description of the preferred embodiment has been in general terms, and it will be understood by those skilled in the art that the invention has general application to many different operating systems, including MS-DOS, Apple Macintosh OS, OS/2, Unix, etc.
The most common operating system utilised today is the MS-DOS operating system. This operating system is designed to run on Intel x86 microprocessors and includes a large number of historical "quirks" which give rise to greater complexity than would perhaps be otherwise required when designing a new operating system from "scratch". For illustrative purposes, there will now be presented a specific embodiment of the preferred embodiment designed to operate under the MS-DOS operating system. Unfortunately, the example is quite complex as it operates in the framework of the MS-DOS operating system. Therefore, it is assumed that the reader is familiar with systems programming under the MS-DOS operating system. For an extensive explanation of the inner workings of the MS-DOS operating system, reference is made to standard texts in this field. For example, reference is made to "PC Intern" by Michael Tischer, published in 1994 by Abacus, 5370 52nd Street, S.E., Grand Rapids, MI 49512. A second useful text in this matter is "PC Architecture and Assembly Language" by Barry Kauler, published 1993 by Karda Prints, 22 Regatta Drive, Edgewater, WA 6027, Australia.
The specific embodiment of the present invention will be described with reference to altering an "EXE" executable program under DOS in accordance with the principles of the present invention.
Referring now to Fig.9, there is shown the structure 90 of an executable ".EXE" program in MS-DOS as normally stored on disk. This structure is closely related to the structure 16 of Fig.7, which illustrates the more general case. The structure 90 includes a header 71, otherwise known in MS-DOS terminology as the program segment prefix (PSP). This is normally followed by a relocation table 91 which contains a list of pointers to variables within a code area 72 which must be updated with an offset address when the program is loaded into a particular area of memory. The operation of the relocation table is well known to those skilled in the art of systems programming. The next portion of structure 90 is the code area 72, which contains the machine instructions for operation on the x86 microprocessor. This is followed by a program data area 73 which contains the data for code area 72. Finally, there may exist a number of overlays 92 which contain code which can be utilised in a known manner.
Referring now to Fig.8, there is shown the structure of the EXE file header 71 in more detail, the table of Fig.8 being reproduced from page 750 of the abovementioned Tischer reference. It should be noted that the header 71 includes a number of fields including, for example, a pointer 81 to the start of the code 72 (Fig.7) and a pointer 82 to the relocation table 91 (Fig.9).
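For orientation, the fixed portion of the MS-DOS EXE ("MZ") header can be expressed as the following C structure (the field names are conventional ones, not taken from the Tischer table itself); the fields relevant to the applicator are the initial CS:IP pair, the relocation entry count, and the file offset of the relocation table.

    #include <stdint.h>

    /* Fixed portion of the MS-DOS "MZ" EXE header as stored on disk.
     * The applicator reads and rewrites several of these fields, notably
     * the initial CS:IP (start address) and the relocation table offset. */
    struct exe_header {
        uint16_t e_magic;     /* "MZ" signature (0x5A4D)                  */
        uint16_t e_cblp;      /* bytes used in the last 512-byte page     */
        uint16_t e_cp;        /* total number of 512-byte pages           */
        uint16_t e_crlc;      /* number of relocation table entries       */
        uint16_t e_cparhdr;   /* header size in 16-byte paragraphs        */
        uint16_t e_minalloc;  /* minimum extra paragraphs required        */
        uint16_t e_maxalloc;  /* maximum extra paragraphs requested       */
        uint16_t e_ss;        /* initial (relative) stack segment         */
        uint16_t e_sp;        /* initial stack pointer                    */
        uint16_t e_csum;      /* file checksum (rarely used)              */
        uint16_t e_ip;        /* initial instruction pointer              */
        uint16_t e_cs;        /* initial (relative) code segment          */
        uint16_t e_lfarlc;    /* file offset of the relocation table      */
        uint16_t e_ovno;      /* overlay number                           */
    };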
In the specific embodiment, the applicator program 60 (Fig.6) proceeds by means of the following steps:
(1) The executable program 16 is opened for reading and a determination made of its size.
(2) The header 71 (Fig.9) of executable program 16 is then read in and a copy is stored within applicator program 60. A copy of the header 71 is written out to form part 101 of the new.exe file 30 as illustrated in Fig.10.
(3) Next, from the fields 81, 82 of the header 71 (Fig.8), a determination is made of the size of relocation table 91 of executable program 16.
(4) Next, a determination is made of the size of the executable code 72 and data portions 73.
(5) The relocation table 91 is then read into the memory of the applicator program 60. As noted previously, the relocation table 91 consists of a series of pointers to positions within code segment 72 which are required to be updated when loading the program.exe file into memory for execution. The relocation table is sorted 93 by address before being written out to the new.exe executable file at position 102.
(6) As noted previously, the relocation table 91 consists of a series of pointers into code area 72. A determination is made of the size of a body of code, known as the "netsafe 1" code 104, the contents of which will be described hereinafter. Next, a search is conducted of the sorted relocation table 102 to find an area between two consecutive pointers within code section 72 which is of greater magnitude than the size of netsafe 1 code 104 (a sketch of this search is given after the description of these steps). This area 94, designated part B in Fig.9, is located. If this code portion 94 cannot be located, the applicator program 60 exits with an error condition. Upon finding code portion 94, the code portion 95, also denoted part A, is encrypted and copied across to form new code portion 103. Code portion 94 is then encrypted and copied to an area 105 of new.exe 30. The netsafe 1 code 104 is then inserted by applicator 60. Code portion 96, also denoted part C, is encrypted and copied across to form code portion 106. Data portion 73 and overlay portion 92 are copied into new.exe 30 as shown. A second portion of obfuscating code, denoted "netsafe 2" 107, the contents of which will be described hereinafter, is then inserted after overlays 92 and before code portion part B 105.
(7) The header 101 is then updated to reflect the altered layout of new.exe executable 30. Additionally, the initial address 109 of execution stored in header 101 is altered to be the start of netsafe 1 portion 104.
(8) As mentioned before, code portions 103, 106 and 105 are subjected to encryption or encipherment in accordance with step 62 of Fig.6. The encryption scheme utilised can be subjected to substantial variation. In this embodiment, the DES standard encryption scheme was utilised. This scheme relies on a fifty-six bit key for encryption and decryption and is well known in the art. Once encrypted, it is necessary to store the decryption key in new.exe executable 30. A number of different methods can be utilised to store the key. The preferred method is to spread portions of the key to different positions within the executable 30. For example, bits of the key can be stored within the netsafe 1 code 104 and netsafe 2 code 107. Additionally, bits of the key can be stored within header portion 101. Also, it is envisaged that bits of the key can be stored in the condition codes which are a consequence of execution of various instructions within netsafe 1 area 104 and netsafe 2 area 107 and/or the operating system 17 (Fig.5), with the overall requirement being that the key can be later extracted using a predetermined algorithm.
(9) The next step is to patch the address of the start of code area 72 and netsafe 2 code area 107 into the required locations within netsafe 1 area 104. The netsafe 1 area is then written to the file containing new.exe executable 30.
(10) The area 106 is then encrypted as aforementioned and written to the executable 30, followed by overlays 92 and encrypted netsafe 2 code portion 107.
(11) As will become apparent hereinafter, upon execution of new.exe executable 30, netsafe 2 area 107 is responsible for loading code portion 105 over the top of netsafe 1 area 104. Therefore, it is necessary to write the relevant addresses of the start and end of code portion 94 to the required positions within netsafe 2 area 107.
(12) As will be described hereinafter, netsafe 2 area 107 is also responsible for decrypting the encrypted portions of code 103, 104, 105, 106, and 107, and hence the netsafe 2 area 107 must also store this combined code size for later use on decryption.
Finally, an overall checksum for new.exe 30 is calculated and stored at the end of the file at position 108. This checksum is later used to verify the decryption procedure's success and to prevent the execution of "scrambled" code, which would be the result if new.exe 30 were tampered with.
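The gap search of step (6) can be sketched as follows (portable C; the relocation target offsets are assumed to have been read and sorted as in step (5), only offsets within the code section are considered, and the size of the field patched by the loader at each target is an assumption). The routine returns the first region between two consecutive relocation targets large enough to hold the netsafe 1 stub, or an error value if none exists.

    #include <stdint.h>
    #include <stddef.h>

    #define RELOC_PATCH_SIZE 2u   /* bytes patched by the loader at each target (assumed) */

    /* Given relocation target offsets sorted in ascending order (step 5),
     * find the first gap between consecutive targets that is at least
     * netsafe1_size bytes long, i.e. a stretch of code that the loader
     * will never patch and which can therefore hold the netsafe 1 stub.
     * Returns the gap's starting offset, or (uint32_t)-1 if none exists. */
    static uint32_t find_relocation_gap(const uint32_t *sorted_offsets,
                                        size_t count, uint32_t netsafe1_size)
    {
        for (size_t i = 0; i + 1 < count; i++) {
            uint32_t gap_start = sorted_offsets[i] + RELOC_PATCH_SIZE;
            uint32_t gap_end   = sorted_offsets[i + 1];
            if (gap_end > gap_start && gap_end - gap_start >= netsafe1_size)
                return gap_start;
        }
        return (uint32_t)-1;      /* no gap found: the applicator exits with an error */
    }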
As will be further described hereinafter, netsafe code areas 104 and 107 contain code to decrypt the encrypted areas of the new.exe 30, to repatch code portion 105 back to its original position, and to replace potentially insecure routines or easily spoofed screens normally utilised by the application (eg: unsafe keyboard drivers) with an alternative safe form of routine.
Upon execution of the new.exe executable 30, the executable starts at the start of netsafe 1 area 104 (Fig.11), as this address has been previously patched into position 109 (Fig.10) of header 101 (Fig.10). The netsafe 1 area 104 then performs the following steps (A1) to (A10):
(A1) The first step is to disable all the interrupts apart from those necessary for continued operation of the computer device 18 (Fig.1) (for example, memory refresh cannot be disabled). The disabling of interrupts includes the disabling of the keyboard interrupt in order to stop amateur "code snoopers" from determining the operation of the code area 104.
(A2) The next step is to interrogate the calling environment of the operating system stack to ensure that the program new.exe was not called by a debugging program which is tracing the operation of new.exe. Additionally, the data variables necessary for operation of netsafe 1 code area 104 are defined to be on the operating system stack (refer addresses 0EH and 10H in Fig.8). This stack will change unexpectedly when in a code-snooping or debugging environment and will cause the debugger to crash, thereby stopping it from following the operation of new.exe executable 30.
(A4) The interrupt trap addresses are then altered in a two-stage process. The first stage resets a first part of the SEG:OFF address format and occurs at this point, with a second stage occurring at a later time, as will be further described herein below. By staging the alteration of interrupt trap addresses, any code snooper will be further confused, as said trap addresses will initially be garbage.
(A5) Any input from the keyboard is further disabled by informing the MS-DOS operating system to ignore any received keys.
(A6) The second stage of the revectoring of the normal debugging interrupts is then applied, so that the normal debugging interrupts can be used by the decryption code (to be described hereinafter), thereby making debugging almost impossible.
(A7) A check is then made to ensure that the above processes have been successful, in that the debugger interrupts do not point to any debuggers, the keyboard is still disabled, and the operating system has disabled the acceptance of keys from the keyboard.
(A8) The key for decryption is then reconstructed, utilising the reverse of the process used to store the key information.
(A9) Turning now to Fig.11, there is shown the standard format of the executable new.exe 30 when executing in memory. As will be well known to those skilled in the art, an executing program under the MS-DOS system will include a stack 111 and work space 112. A memory allocation (malloc) call is then made to set aside an area 113 for the loading in of the netsafe 2 code 107 of Fig.10. The disk copy of new.exe 30 (having the format shown in Fig.10) is then opened by the netsafe 1 code 115, and an encrypted copy of netsafe 2 code 107 (Fig.10) is then loaded in from the disk file, decrypted and stored in memory area 113. The relocatable pointers of the code contained within the netsafe 2 code 113 are then updated to reflect the position of the executable in memory.
(A10) Control is then passed to netsafe 2 code 113.
The code area netsafe 2, 113, then performs the following steps (B1) to (B4):
(B1) The portion of code of the disk copy denoted part B, 105 (Fig.10), is read in from disk in an encrypted format and written over the old netsafe 1 code 115.
(B2) As will be further described hereinafter, the netsafe 2 area 113 includes a number of keyboard routines which are preferably stored in an encrypted format. Therefore, the next step is to apply the decryption to any of the encrypted areas of netsafe 2 code area 113. After decryption, the netsafe 2 area 113 is checksummed and the result is tested against a prestored checksum to ensure the integrity of netsafe 2 area 113.
(B3) The disk copy of the new.exe is then again read in and checked against prestored check data to ensure that it has not been changed. Additionally, an attempt is made to read past the end of file of the disk copy of new.exe 30 (Fig.10) to ensure that no extension (eg: viral) has occurred.
(B4) The encrypted portions of the memory copy (Fig.11) of new.exe are then decrypted utilising the key and, once decrypted, the decrypted portions are again checked and tested against predetermined data.
The next step in the execution of the netsafe 2 code 113 is to replace insecure (eg: keyboard) system routines with a more secure method. Referring now to Fig.12, there is shown the current state of the new.exe executable in memory. The insertion of the more secure system routines then proceeds in accordance with the following steps (C1) to (C5):
(C1) Firstly, a second memory allocation is done to set aside an area 51 (Fig.13) for the storing of the secure hardware routines (eg: keyboard). These routines are then copied from their area within netsafe 2 code 113 to the memory area 51.
(C2) Next, the interrupt table 131 entries which normally activate the ID-Data entry routines when dealing with ID-Data input are altered such that, rather than pointing to corresponding areas of the MS-DOS operating system 17, they point to the corresponding secure area 51 (a sketch of this revectoring is given after step (C5) below). These interrupts include interrupt 9, which occurs when a key is pressed on the keyboard; interrupt 29h, which reads a key; and interrupt 16h, which tests for the presence of a key.
(C3) The executable 30 (Fig.13) is then ready for execution; the registers are initialised, the memory area 113 is deallocated, and control passes to the original start address of executable program 16.
(C4) It will be evident that, when executing, all keyboard calls (or other ID-Data entry calls, if other than keyboard) will be passed to the keyboard (or other) routines 51, with the keyboard hardware being interrogated directly by keyboard routines 51 to return information to the calling program. Keyboard routines 51 include a copy of the correct interrupt vector addresses for each keyboard routine and, each time they are called, a check is made of the interrupt table to ensure that it has not been altered. Preferably, keyboard routines 51 protect the keyboard hardware by issuing controller reset or similar commands to flush the keyboard data out of the circuitry after said data is retrieved, to prevent hardware eavesdropping, or routines 51 utilise the protected mechanisms of the central processor to protect said hardware from eavesdropping.
(C5) When the executable 30 terminates, interrupt 21h (an MS-DOS standard) is called. This interrupt is also revectored to a corresponding area of routines 51. The termination code of keyboard routine area 51 restores the correct interrupt pointers in interrupt table 131 to point to the MS-DOS operating system 17, and clears the no-longer-needed program and data from memory before returning to the DOS operating system by calling the real interrupt 21h.
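A compressed sketch of the revectoring in steps (C2), (C4) and (C5) is given below, written for a 16-bit Borland/Turbo C environment (the dos.h functions getvect(), setvect(), inportb() and outportb() are specific to that environment; the handler body is illustrative and omits the secure buffering described earlier). It saves the original INT 9 vector, installs a replacement that reads the scan code directly from the keyboard port, and restores the original vector on termination.

    #include <dos.h>      /* getvect, setvect, inportb, outportb: Borland/Turbo C only */

    #define KEYBOARD_INT   0x09
    #define KEYBOARD_PORT  0x60

    static void interrupt (*old_int9)(void);   /* saved MS-DOS/BIOS vector */

    /* Replacement INT 9 handler: read the scan code straight from the
     * keyboard controller so that no copy passes through the (possibly
     * monitored) operating system buffers.  Secure storage of the scan
     * code, as described earlier, is omitted from this sketch.          */
    static void interrupt secure_int9(void)
    {
        unsigned char scan = inportb(KEYBOARD_PORT);
        (void)scan;                        /* fold into secure buffer here */
        outportb(0x20, 0x20);              /* acknowledge the interrupt    */
    }

    void install_secure_keyboard(void)     /* step (C2)                    */
    {
        old_int9 = getvect(KEYBOARD_INT);
        setvect(KEYBOARD_INT, secure_int9);
    }

    void restore_keyboard(void)            /* step (C5), at termination    */
    {
        setvect(KEYBOARD_INT, old_int9);
    }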
The foregoing describes only one particular embodiment of the present invention, directed particularly to operation under the MS-DOS operating system. It will be evident to those skilled in the art that the principles outlined in the particular embodiment can be equally applied to other operating systems in accordance with the objects of the present invention. Further modifications, obvious to those skilled in the art, can be made thereto without departing from the scope of the invention.
EXPLANATION AND PURPOSE OF CLAIMS
Claims 1, 2, and 3 are independent. The invention in claim 1 covers any high-security software protecting ID-Data by utilising anti-spy techniques and tamper-protecting itself. Claim 2 is for a method of producing high-security software, such as, but not limited to, that in claim 1. Claim 3 is for a new process of graphically representing the authenticity of high-security software, such as, but not limited to, that in claim 1 or produced by claim 2.
Claims 4, 5, 6, 7, 8, and 9 add preferred components to the high-security enforcing functions of the software in claim 1. Claim 10 adds a tracing-prevention preferred component to claim 9.
Claims 11, 12, 13, 14, 15, 16, 50, and 53 add preferred components to the security-applicator method of claim 2.
Claims 17 to 49 inclusive, and claims 51 and 52, outline the specific area of protection that this invention affords a computer program acting as a user interface (eg: ID-Data entry screen). Specifically, they specify how this invention applies in the areas of protecting an interface against counterfeiting (ie: hampering the possibility that a fake copy of said interface can be successfully presented to a user to fool said user into entering information into the fake interface), and protecting an interface against malicious (or otherwise) tampering, examination, emulation, and eavesdropping.