Cyber Security Glossary of Terms  

A B C D E F G H I J K L M N O P Q R S T U V W X Y Z

Access. A specific type of interaction between a subject and an object that results in the flow of information from one to the other. The capability and opportunity to gain knowledge of, or to alter information or materials including the ability and means to communicate with (i.e., input or receive output), or otherwise make use of any information, resource, or component in a computer system.

Access Control. The process of limiting access to the resources of a system to only authorized persons, programs, processes, or other systems. Synonymous with controlled access and limited access. Requires that access to information resources be controlled by or for the target system. In the context of network security, access control is the ability to limit and control the access to host systems and applications via communications links. To achieve this control, each entity trying to gain access must first be identified, or authenticated, so that access rights can be tailored to the individual.
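The identify-then-tailor flow described above can be sketched as a simple access-control list lookup. This is an illustrative sketch only; all user, resource, and permission names are hypothetical.

```python
# Illustrative sketch: an access-control list (ACL) mapping authenticated
# identities to the operations they may perform on each resource.
# All names here are hypothetical.

ACL = {
    "payroll.db": {"alice": {"read", "write"}, "bob": {"read"}},
    "audit.log":  {"alice": {"read"}},
}

def is_allowed(user: str, resource: str, operation: str) -> bool:
    """Return True only if the identified user holds the permission."""
    return operation in ACL.get(resource, {}).get(user, set())
```

Note that the check presupposes the caller has already been authenticated; access rights are then tailored to that established identity.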

Accreditation/Approval. The official management authorization for operation of an MIS. It provides a formal declaration by an Accrediting Authority that a computer system is approved to operate in a particular security mode using a prescribed set of safeguards. Accreditation is based on the certification process as well as other management considerations. An accreditation statement affixes security responsibility with the Accrediting Authority and shows that proper care has been taken for security.

Adequate Security. Security commensurate with the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of information. This includes assuring that systems and applications used by the agency operate effectively and provide appropriate confidentiality, integrity, and availability, through the use of cost-effective management, personnel, operational and technical controls.

ADP. Automatic Data Processing. See also: Management Information System

Application. A software organization of related functions, or series of interdependent or closely related programs, that when executed accomplish a specified objective or set of user requirements. See also: Major Application, Process.

Application Owner. The official responsible for ensuring that the program or programs that make up the application accomplish the specified objective or set of user requirements established for that application, including appropriate security safeguards. See also: Process Owner.

Audit. To conduct the independent review and examination of system records and activities.

Audit trail. A set of records that collectively provides documentary evidence of processing. It is used to aid in tracing from original transactions forward to related records and reports, and/or backwards from records and reports to their component source transactions.
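The forward and backward tracing described above can be illustrated with a minimal append-only record list. The field and identifier names below are hypothetical.

```python
# Illustrative sketch of an audit trail: each processing step appends a
# record linking a derived report back to its source transaction, so the
# trail can be traced in either direction. Names are hypothetical.

audit_trail: list[dict] = []

def record(source_id: str, derived_id: str, action: str) -> None:
    """Append one documentary record of a processing step."""
    audit_trail.append({"source": source_id,
                        "derived": derived_id,
                        "action": action})

def trace_back(derived_id: str) -> list[str]:
    """Trace a report backward to its component source transactions."""
    return [r["source"] for r in audit_trail if r["derived"] == derived_id]
```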

Automatic Data Processing (ADP). The assembly of computer hardware, firmware, and software used to categorize, sort, calculate, compute, summarize, store, retrieve, control, process, and/or protect data with a minimum of human intervention. ADP systems can include, but are not limited to, process control computers, embedded computer systems that perform general purpose computing functions, supercomputers, personal computers, intelligent terminals, office automation systems (which include standalone microprocessors, memory typewriters, and terminals connected to mainframes), firmware, and other implementations of MIS technologies as may be developed; they also include applications and operating system software. See also: Management Information System.

Authenticate/Authentication. 1) The process to verify the identity of a user, device, or other entity in a computer system, often as a prerequisite to allowing access to resources in a system. 2) A process used to verify that the origin of transmitted data is correctly identified, with assurance that the identity is not false. To establish the validity of a claimed identity.
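Sense 1 above, verifying a claimed identity before granting access, is commonly implemented by comparing a freshly computed salted password hash against a stored one. A minimal sketch (the hash algorithm and iteration count are illustrative choices, not a mandated standard):

```python
# Illustrative sketch of identity verification: a stored salted hash is
# compared against a hash of the submitted password. Parameters are
# illustrative only.
import hashlib
import hmac
import os

def make_record(password: str) -> tuple[bytes, bytes]:
    """Create the salt and digest stored when the account is set up."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password: str, salt: bytes, digest: bytes) -> bool:
    """Verify the claimed identity using a constant-time comparison."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```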

Authenticated user. A user who has accessed a MIS with a valid identifier and authentication combination.

Authorization. The privileges and permissions granted to an individual by a designated official to access or use a program, process, information, or system. These privileges are based on the individual's approval and need-to-know.

Authorized Person. A person who has the need-to-know for sensitive information in the performance of official duties and who has been granted authorized access at the required level. The responsibility for determining whether a prospective recipient is an authorized person rests with the person who has possession, knowledge, or control of the sensitive information involved, and not with the prospective recipient.

Availability. The property of being accessible and usable upon demand by an authorized entity. Security constraints must make MIS services available to authorized users and unavailable to unauthorized users.

Back-up. A copy of a program or data file for the purposes of protecting against loss if the original data becomes unavailable.

Back-up Operation. A method of operations to complete essential tasks as identified by a risk analysis. These tasks would be employed following a disruption of the MIS and continue until the MIS is acceptably restored. See also: Disaster Recovery, Contingency Operations.

C2. A level of security safeguard criteria. See also: Controlled Access Protection, TCSEC.

Capstone. The U.S. Government's long-term project to develop a set of standards for publicly-available cryptography, as authorized by the Computer Security Act of 1987. The Capstone cryptographic system will consist of four major components and be contained on a single integrated circuit microchip that provides non-DoD data encryption for Sensitive But Unclassified information. It implements the Skipjack algorithm. See also: Clipper.

Certification. The comprehensive analysis of the technical and nontechnical features, and other safeguards, to establish the extent to which a particular MIS meets a set of specified security requirements. Certification is part of the accreditation process and carries with it an implicit mandate for accreditation. See also: Accreditation.

Channel. An information transfer path within a system or the mechanism by which the path is effected.

Cipher. An algorithm for encryption or decryption. A cipher replaces a piece of information (an element of plain text) with another object, with the intent to conceal meaning. Typically, the replacement rule is governed by a secret key. See also: Encryption, Decryption.
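A toy illustration of a key-governed, reversible replacement rule is a repeating-key XOR cipher. This is a teaching sketch only and offers no real security.

```python
# Illustrative sketch of a cipher: XOR each plaintext byte with a
# repeating key byte. Applying the same function again with the same
# key reverses the substitution, recovering the plaintext.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))
```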

Classification. A systematic arrangement of information in groups or categories according to established criteria. In the interest of national security it is determined that the information requires a specific degree of protection against unauthorized disclosure together with a designation signifying that such a determination has been made. See also: Limited Official Use.

Clear or clearing (MIS Storage Media). The removal of sensitive data from MIS storage and other peripheral devices with storage capacity, at the end of a period of processing. It includes data removal in such a way that assures, proportional to data sensitivity, it may not be reconstructed using normal system capabilities, i.e., through the keyboard. See also: Remanence, Object Reuse.

Clipper. Clipper is an encryption chip developed and sponsored by the U.S. government as part of the Capstone project. Announced by the White House in April 1993, Clipper was designed to balance competing concerns of Federal law-enforcement agencies and private citizens by using escrowed encryption keys. See also: Capstone, Skipjack.

Commercial-Off-The-Shelf (COTS). Products that are commercially available and can be utilized as generally marketed by the manufacturer.

Compromise. The disclosure of sensitive information to persons not authorized access or having a need-to-know.

Computer Fraud and Abuse Act of 1986. This law makes it a crime to knowingly gain access to a Federal Government computer without authorization and to affect its operation. [18 USC 1030]

Computer Security. Technological and managerial procedures applied to MIS to ensure the availability, integrity, and confidentiality of information managed by the MIS. See also: Information System Security.

Computer Security Act of 1987. The law provides for improving the security and privacy of sensitive information in "federal computer systems"--"a computer system operated by a Federal agency or other organization that processes information (using a computer system) on behalf of the Federal Government to accomplish a Federal function." [PL 100-235]

Confidentiality. The condition when designated information collected for approved purposes is not disseminated beyond a community of authorized knowers. It is distinguished from secrecy, which results from the intentional concealment or withholding of information. [OTA-TCT-606]
Confidentiality refers to: 1) how data will be maintained and used by the organization that collected it; 2) what further uses will be made of it; and 3) when individuals will be required to consent to such uses. It includes the protection of data from passive attacks and requires that the information (in an MIS or transmitted) be accessible only for reading by authorized parties. Access can include printing, displaying, and other forms of disclosure, including simply revealing the existence of an object.

Configuration Management (CM). The management of changes made to a MIS hardware, software, firmware, documentation, tests, test fixtures, test documentation, communications interfaces, operating procedures, installation structures, and all changes thereto throughout the development and operational life-cycle of the MIS.

Contingency Plan. The documented organized process for implementing emergency response, back-up operations, and post-disaster recovery, maintained for a MIS as part of its security program, to ensure the availability of critical assets (resources) and facilitate the continuity of operations in an emergency. See also: Disaster Recovery.

Contingency Planning. The process of preparing a documented organized approach for emergency response, back-up operations, and post-disaster recovery that will ensure the availability of critical MIS resources and facilitate the continuity of MIS operations in an emergency. See also: Contingency Plan, Disaster Recovery.

Controlled Access Protection (C2). A category of safeguard criteria as defined in the Trusted Computer Security Evaluation Criteria (TCSEC). It includes identification and authentication, accountability, auditing, object reuse, and specific access restrictions to data. This is the minimum level of control for SBU information.

Conventional Encryption. A form of cryptosystem in which encryption and decryption are performed using the same key. See also: Symmetric Encryption.

COTS. See: Commercial-Off-The-Shelf.

Countermeasures. See: Security Safeguards

Cracker. See: Hacker.

Critical Assets. Those assets that provide direct support to the organization's ability to sustain its mission. Assets are critical if their absence or unavailability would significantly degrade the ability of the organization to carry out its mission, and when the time that the organization can function without the asset is less than the time needed to replace the asset.

Critical processing. Any applications so important to an organization that little or no loss of availability is acceptable; critical processing must be defined carefully during disaster and contingency planning. See also: Critical Assets.

Cryptanalysis. The branch of cryptology dealing with the breaking of a cipher to recover information, or the forging of encrypted information that will be accepted as authentic.

Cryptography. The branch of cryptology dealing with the design of algorithms for encryption and decryption, intended to ensure the secrecy and/or authenticity of messages.

Cryptology. The study of secure communications, which encompasses both cryptography and cryptanalysis.

DAC. See: Discretionary Access Control, C2, TCSEC.

DASD (Direct Access Storage Device). A physical electromagnetic data storage unit used in larger computers. Usually these consist of cylindrical stacked multi-unit assemblies that have large-capacity storage capabilities.

Data. A representation of facts, concepts, information, or instructions suitable for communication, interpretation, or processing. It is used as a plural noun meaning "facts or information" as in: These data are described fully in the appendix, or as a singular mass noun meaning "information" as in: The data is entered into the computer. [Random House Webster's College Dictionary, 1994]

Data Encryption Standard (DES). Data Encryption Standard is an encryption block cipher defined and endorsed by the U.S. government in 1977 as an official standard (FIPS PUB 46). Developed by IBM®, it has been extensively studied for over 15 years and is the most well-known and widely used cryptosystem in the world. See also: Capstone, Clipper, RSA, Skipjack.

Data integrity. The state that exists when computerized data are the same as those that are in the source documents and have not been exposed to accidental or malicious alterations or destruction. It requires that the MIS assets and transmitted information be capable of modification only by authorized parties. Modification includes writing, changing, changing status, deleting, creating, and the delaying or replaying of transmitted messages. See also: Integrity, System integrity.
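One common mechanism for detecting (though not preventing) the alterations described above is to store a cryptographic digest of the source data and compare it against a digest of the current copy. An illustrative sketch:

```python
# Illustrative sketch: a stored digest of the source data lets later
# readers verify that the computerized copy still matches, detecting
# accidental or malicious alteration after the fact.
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

def unchanged(data: bytes, expected: str) -> bool:
    """True if the data still matches the originally recorded digest."""
    return fingerprint(data) == expected
```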

Deciphering. The translation of encrypted text or data (called ciphertext) into original text or data (called plaintext). See also: Decryption.

Decryption. The translation of encrypted text or data (called ciphertext) into original text or data (called plaintext). See also: Deciphering.

Dedicated Security Mode. An operational method when each user with direct or indirect individual access to a computer system, its peripherals, and remote terminals or hosts has a valid personnel security authorization and a valid need-to-know for all information contained within the system.

DES. See: Data Encryption Standard See also: Capstone, Clipper, RSA, Skipjack.

Dedicated System. A system that is specifically and exclusively dedicated to and controlled for a specific mission, either for full time operation or a specified period of time. See also: Dedicated Security Mode.

Denial of Service. The prevention of authorized access to resources or the delaying of time-critical operations. Refers to the inability of a MIS system or any essential part to perform its designated mission, either by loss of, or degradation of operational capability.

Department of Defense (DOD) Trusted Computer System Evaluation Criteria. The National Computer Security Center (NCSC) criteria intended for use in the design and evaluation of systems that will process and/or store sensitive (or classified) data. This document contains a uniform set of basic requirements and evaluation classes used for assessing the degrees of assurance in the effectiveness of hardware and software security controls built in the design and evaluation of MIS. See also: C2, Orange Book, TCSEC.

Designated Security Officer. The person responsible to the designated high level manager for ensuring that security is provided for and implemented throughout the life-cycle of a MIS from the beginning of the system concept development phase through its design, development, operations, maintenance, and disposal.

Digital Signature Standard. DSS is the Digital Signature Standard, which specifies a Digital Signature Algorithm (DSA), and is part of the U.S. government's Capstone project. It was selected by NIST and NSA to be the digital authentication standard of the U.S. government, but has not yet been officially adopted. See also: Capstone, Clipper, RSA, Skipjack.

Disaster Recovery Plan. The procedures to be followed should a disaster (fire, flood, etc.) occur. Disaster recovery plans may cover the computer center and other aspects of normal organizational functioning. See also: Contingency Plan.

Discretionary Access Control (DAC). A means of restricting access to objects based on the identity of subjects and/or groups to which they belong or on the possession of an authorization granting access to those objects. The controls are discretionary in the sense that a subject with a certain access permission is capable of passing that permission (perhaps indirectly) onto any other subject.
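The defining property above, that a subject holding a permission may pass it on to another subject, can be sketched as follows. The class and names are hypothetical.

```python
# Illustrative sketch of discretionary access control: any subject who
# already holds access to an object may, at their discretion, pass that
# access on to another subject. Names are hypothetical.

class DacObject:
    def __init__(self, owner: str):
        self.owner = owner
        self.readers = {owner}

    def grant_read(self, grantor: str, grantee: str) -> None:
        """Discretionary grant: an existing reader may add another."""
        if grantor not in self.readers:
            raise PermissionError(grantor)
        self.readers.add(grantee)
```

The chain alice → bob → carol below shows how a permission can propagate indirectly, which is precisely what distinguishes discretionary from mandatory controls.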

Discretionary processing. Any computer work that can withstand interruption resulting from some disaster.

DSS. See: Digital Signature Standard, Capstone, Clipper, RSA, Skipjack.

Emergency Response. A response to emergencies such as fire, flood, civil commotion, natural disasters, bomb threats, etc., in order to protect lives, limit the damage to property and the impact on MIS operations.

Enciphering. The conversion of plaintext or data into unintelligible form by means of a reversible translation that is based on a translation table or algorithm. See also: Encryption.

Encryption. The conversion of plaintext or data into unintelligible form by means of a reversible translation that is based on a translation table or algorithm. See also: Enciphering.

Entity. Something that exists as independent, distinct or self-contained. For programs, it may be anything that can be described using data, such as an employee, product, or invoice. Data associated with an entity are called attributes. A product's price, weight, quantities in stock, and description all constitute attributes. It is often used in describing distinct business organizations or government agencies.

Environment. The aggregate of external circumstance, conditions, and events that affect the development, operation, and maintenance of a system. Environment is often used with qualifiers such as computing environment, application environment, or threat environment, which limit the scope being considered.

Evaluation. Evaluation is the assessment for conformance with a preestablished metric, criteria, or standard.

Firewall. A collection of components or a system that is placed between two networks and possesses the following properties: 1) all traffic from inside to outside, and vice versa, must pass through it; 2) only authorized traffic, as defined by the local security policy, is allowed to pass through it; 3) the system itself is immune to penetration. [CHES94]
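Property 2 above, that only traffic authorized by the local security policy may pass, can be sketched as a rule table with a default-deny fallback. The addresses, ports, and rule format are hypothetical.

```python
# Illustrative sketch of policy-based filtering: every packet is checked
# against a rule table; anything not explicitly authorized is dropped.
# Addresses and ports are hypothetical.

RULES = [
    # (source prefix, destination port, verdict)
    ("10.0.", 80,  "allow"),
    ("10.0.", 443, "allow"),
]

def filter_packet(src_ip: str, dst_port: int) -> str:
    for prefix, port, verdict in RULES:
        if src_ip.startswith(prefix) and dst_port == port:
            return verdict
    return "deny"  # default-deny: unauthorized traffic does not pass
```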

Firmware. Equipment or devices within which computer programming instructions necessary to the performance of the device's discrete functions are electrically embedded in such a manner that they cannot be electrically altered during normal device operations.

Gateway. A machine or set of machines that provides relay services between two networks.

General Support System. An interconnected set of information resources under the same direct management control which shares common functionality. A system normally includes hardware, software, information, data, applications, communications, and people. A system can be, for example, a local area network (LAN) including smart terminals that support a branch office, an agency-wide backbone, a communications network, a departmental data processing center including its operating system and utilities, a tactical radio network, or a shared information processing service organization (IPSO).

Hack. Any software in which a significant portion of the code was originally another program. Many hacked programs simply have the copyright notice removed. Some hacks are done by programmers using code they have previously written that serves as a boilerplate for a set of operations needed in the program they are currently working on. In other cases it simply means a draft. Commonly misused to imply theft of software. See also: Hacker

Hacker. Common nickname for an unauthorized person who breaks into or attempts to break into a MIS by circumventing software security safeguards. Also, commonly called a "cracker." See also: Intruder, Hack.

Information Security. The protection of information systems against unauthorized access to or modification of information, whether in storage, processing or transit, and against the denial of service to authorized users or the provision of service to unauthorized users, including those measures necessary to detect, document, and counter such threats.

Information Systems Security (INFOSEC). The protection of information assets from unauthorized access to or modification of information, whether in storage, processing, or transit, and against the denial of service to authorized users or the provision of service to unauthorized users, including those measures necessary to detect, document, and counter such threats. INFOSEC reflects the concept of the totality of MIS security. See also: Computer Security.

Identification. The process that enables recognition of an entity by a system, generally by the use of unique machine-readable user names.

Information Security Officer (ISO). The person responsible to the designated high level manager for ensuring that security is provided for and implemented throughout the life-cycle of a MIS from the beginning of the system concept development phase through its design, development, operations, maintenance, and disposal.

Integrity. A subgoal of computer security which ensures that: 1) data is a proper representation of information; 2) data retains its original level of accuracy; 3) data remains in a sound, unimpaired, or perfect condition; 4) the MIS performs correct processing operations; and 5) the computerized data faithfully represent those in the source documents and have not been exposed to accidental or malicious alteration or destruction. See also: Data integrity, System integrity.

Interconnected System. An approach in which the network is treated as an interconnection of separately created, managed, and accredited MIS.

Internet. A world-wide "network of networks" that uses Transmission Control Protocol/Internet Protocol (TCP/IP) for communications.

Intruder. An individual who gains, or attempts to gain, unauthorized access to a computer system or to gain unauthorized privileges on that system. See also: Hacker

Key Distribution Center. A system that is authorized to transmit temporary session keys to principals (authorized users). Each session key is transmitted in encrypted form, using a master key that the key distribution center shares with the target principal. See also: DSS, Encryption, Kerberos.

Kerberos. Kerberos is a secret-key network authentication system developed by MIT and uses DES for encryption and authentication. Unlike a public-key authentication system, it does not produce digital signatures. Kerberos was designed to authenticate requests for network resources rather than to authenticate authorship of documents. See also: DSS.

Label. The marking of an item of information that reflects its information security classification. An internal label is the marking of an item of information that reflects the classification of that item within the confines of the medium containing the information. An external label is a visible or readable marking on the outside of the medium or its cover that reflects the security classification information resident within that particular medium. See also: Confidential.

LAN (Local Area Network). An interconnected system of computers and peripherals. LAN users can share data stored on hard disks in the network and can share printers connected to the network.

Least Privilege. The principle that requires each subject be granted the most restrictive set of privileges needed for the performance of authorized tasks. The application of this principle limits the damage that can result from accident, error, or unauthorized use.
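A sketch of deriving a subject's privileges from its authorized tasks rather than granting a broad standing set. Task and privilege names are hypothetical.

```python
# Illustrative sketch of least privilege: privileges are computed from
# the authorized tasks, so each subject receives only the most
# restrictive set those tasks require. Names are hypothetical.

TASK_PRIVILEGES = {
    "print-report": {"read:reports"},
    "edit-report":  {"read:reports", "write:reports"},
}

def privileges_for(tasks: list[str]) -> set[str]:
    """Union of exactly the privileges the authorized tasks need."""
    needed: set[str] = set()
    for task in tasks:
        needed |= TASK_PRIVILEGES[task]
    return needed
```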

Malicious Code. Software or firmware that is intentionally included in a MIS for an unauthorized purpose. See also: Trapdoor, Trojan Horse, Virus, Worm.

Major Application. An application that requires special attention to security due to the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of the information in the application. See also: Application, Process.

Management Information System (MIS). A MIS is an assembly of computer hardware, software, and/or firmware configured to collect, create, communicate, compute, disseminate, process, store, and/or control data or information. Examples include: information storage and retrieval systems, mainframe computers, minicomputers, personal computers and workstations, office automation systems, automated message processing systems (AMPSs), and those supercomputers and process control computers (e.g., embedded computer systems) that perform general purpose computing functions.

MIS Owner. The official who has the authority to decide on accepting the security safeguards prescribed for a MIS and is responsible for issuing an accreditation statement that records the decision to accept those safeguards. See also: Accrediting Authority (AA), Application Owner, Process Owner.

MIS Security. Measures or controls that safeguard or protect a MIS against unauthorized (accidental or intentional) disclosure, modification, destruction of the MIS and data, or denial of service. MIS security provides an acceptable level of risk for the MIS and the data contained in it. Considerations include: 1) all hardware and/or software functions, characteristics, and/or features; 2) operational procedures, accountability procedures, and access controls at all computer facilities in the MIS; 3) management constraints; 4) physical structures and devices; and 5) personnel and communications controls.

Microprocessor. A semiconductor central processing unit contained on a single integrated circuit chip.

National Computer Security Center (NCSC). A government organization within the National Security Agency (NSA) that produces technical reference materials relating to a wide variety of computer security areas. It is located at 9800 Savage Rd., Ft. George G. Meade, MD.

National Telecommunications and Information Systems Security Policy. Directs Federal agencies, by July 15, 1992, to provide automated Controlled Access Protection (C2 level) for MIS, when all users do not have the same authorization to use the sensitive information.

Network. A communications medium and all components attached to that medium whose responsibility is the transference of information. Such components may include MISs, packet switches, telecommunications controllers, key distribution centers, and technical control devices.

Network Security. Protection of networks and their services from unauthorized modification, destruction, or disclosure, and the provision of assurance that the network performs its critical functions correctly and that there are no harmful side effects.

NIST. National Institute of Standards and Technology in Gaithersburg, MD. NIST publishes a wide variety of materials on computer security, including FIPS publications.

Non-Repudiation. Method by which the sender is provided with proof of delivery and the recipient is assured of the sender’s identity, so that neither can later deny having processed the data.

Nonvolatile Memory Units. Devices that continue to retain their contents when power to the unit is turned off (e.g., bubble memory, read-only memory (ROM)).

Object. A passive entity that contains or receives information. Access to an object potentially implies access to the information it contains. Examples of objects are records, blocks, pages, segments, files, directories, directory tree, and programs as well as bits, bytes, words, fields, processors, video displays, keyboards, clocks, printers, network nodes, etc.

Object Reuse. The reassignment to some subject of a medium (e.g., page frame, disk sector, or magnetic tape) that contained one or more objects. To be securely reassigned, no residual data from previously contained object(s) can be available to the new subject through standard system mechanisms. [NCSC-TG-025] See also: Remanence.

Off-line. Pertaining to the operation of a functional unit when not under direct control of a computer. See also: On-line.

On-line. Pertaining to the operation of a functional unit when under the direct control of a computer. See also: Off-line.

Orange book. Named because of the color of its cover, this is the DoD Trusted Computer System Evaluation Criteria, DoD 5200.28-STD. It provides the information needed to classify computer systems as security levels of A, B, C, or D, defining the degree of trust that may be placed in them. See also: C2, TCSEC.

Overwrite Procedure. A process that removes or destroys data recorded on a computer storage medium by writing patterns of data over, or on top of, the data stored on the medium.

Password. A protected and private character string used to authenticate an AIS user.

Personnel Security. The procedures established to ensure that all personnel who have access to any sensitive information have all required authorities or appropriate security authorizations.

Physical Security. The application of physical barriers and control procedures as preventative measures or safeguards against threats to resources and information.

Privacy Act of 1974. A U.S. law permitting citizens to examine and make corrections to records the government maintains. It requires that Federal agencies adhere to certain procedures in their record keeping and interagency information transfers. Reference: FIPS PUB 41, 05/30/75 (Implementing the Privacy Act of 1974), and the Privacy Act of 1974, As Amended. See also: System of Records.

Private Branch Exchange. Private Branch eXchange (PBX) is a telephone switch providing speech connections within an organization, while also allowing users access to both public switches and private network facilities outside the organization. The terms PBX and PABX are used interchangeably.

Process. An organizational assignment of responsibilities for an associated collection of activities that takes one or more kinds of input to accomplish a specified objective that creates an output that is of value.

Process Owner. The official who defines the process parameters and its relationship to other Customs processes. The process owner has Accrediting Authority (AA) to decide on accepting the security safeguards prescribed for the MIS process and is responsible for issuing an accreditation statement that records the decision to accept those safeguards. See also: Application Owner.

Public Law 100-235. Established minimal acceptable standards for the government in computer security and information privacy. See also: Computer Security Act of 1987.


Rainbow Series. A series of documents published by the National Computer Security Center (NCSC) to discuss in detail the features of the DoD Trusted Computer System Evaluation Criteria (TCSEC) and provide guidance for meeting each requirement. The series is nicknamed the "rainbow" because each document has a different colored cover. See also: NCSC.

Read. A fundamental operation that results only in the flow of information from an object to a subject.

Recovery. The process of restoring an MIS facility and related assets, damaged files, or equipment so as to be useful again after a major emergency that significantly curtailed normal ADP operations. See also: Disaster Recovery.

Remanence. The residual information that remains on storage media after erasure. For discussion purposes, it is better to characterize magnetic remanence as the magnetic representation of residual information that remains on magnetic media after the media has been erased. The magnetic flux that remains in a magnetic circuit after an applied magnetomotive force has been removed. [Random House Webster's College Dictionary, 1994] See also: Object Reuse.

Residual Risk. The part of risk remaining after security measures have been implemented.

Risk Analysis. The process of identifying security risks, determining their magnitude, and identifying areas needing safeguards. An analysis of an organization's information resources, its existing controls, and its remaining organizational and MIS vulnerabilities. It combines the loss potential for each resource or combination of resources with an estimated rate of occurrence to establish a potential level of damage in dollars or other assets. See also: Risk Assessment, Risk Management.
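The quantitative combination described in this entry is often expressed as an annualized loss expectancy (ALE). The sketch below is one common formulation, with invented asset values for illustration; the function name and figures are not drawn from any standard.

```python
# Illustrative annualized loss expectancy (ALE) calculation:
# combining loss potential per incident with an estimated rate
# of occurrence to express potential damage in dollars.

def ale(asset_value, exposure_factor, annual_rate_of_occurrence):
    """Single loss expectancy (SLE) = asset value * exposure factor;
    ALE = SLE * annualized rate of occurrence."""
    sle = asset_value * exposure_factor
    return sle * annual_rate_of_occurrence

# Example: a $200,000 server, 25% damaged per incident,
# with incidents expected twice per year.
estimate = ale(200_000, 0.25, 2.0)   # 50,000 per incident, twice a year
```

A figure like this gives management a dollar basis for deciding whether a proposed safeguard costs less than the loss it prevents.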

Risk Assessment. Process of analyzing threats to and vulnerabilities of an MIS to determine the risks (potential for losses), and using the analysis as a basis for identifying appropriate and cost-effective measures. See also: Risk Analysis, Risk Management.

Note: Risk analysis is a part of risk management, which is used to minimize risk by specifying security measures commensurate with the relative values of the resources to be protected, the vulnerabilities of those resources, and the identified threats against them. The method should be applied iteratively during the system life-cycle. When applied during the implementation phase or to an operational system, it can verify the effectiveness of existing safeguards and identify areas in which additional measures are needed to achieve the desired level of security. There are numerous risk analysis methodologies and some automated tools available to support them.

Risk Management. The total process of identifying, measuring, controlling, and eliminating or minimizing uncertain events that may affect system resources. Risk management encompasses the entire system life-cycle and has a direct impact on system certification. It may include risk analysis, cost/benefit analysis, safeguard selection, security test and evaluation, safeguard implementation, and system review. See also: Risk Analysis, Risk Assessment.

ROM. Read Only Memory. See also: Nonvolatile Memory Units.

RSA. A public-key cryptosystem for both encryption and authentication based on exponentiation in modular arithmetic. The algorithm was invented in 1977 by Rivest, Shamir, and Adleman and is generally accepted as practical and secure for public-key encryption. See also: DES, Capstone, Clipper, Skipjack.
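The modular exponentiation at the heart of RSA can be shown with the standard textbook-sized example below. The primes are far too small to be secure (real keys use primes of 1024 bits or more); this only illustrates the arithmetic.

```python
# Toy RSA demonstration (NOT secure; textbook-sized numbers only).
p, q = 61, 53                    # two small primes
n = p * q                        # public modulus: 3233
phi = (p - 1) * (q - 1)          # Euler's totient: 3120
e = 17                           # public exponent, coprime with phi
d = pow(e, -1, phi)              # private exponent: modular inverse of e (Python 3.8+)

message = 65                     # a message encoded as an integer < n
ciphertext = pow(message, e, n)  # encryption: m^e mod n
recovered = pow(ciphertext, d, n)  # decryption: c^d mod n
assert recovered == message
```

Anyone may encrypt with the public pair (e, n); only the holder of d can decrypt, which is what makes the scheme usable for both confidentiality and authentication.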


Safeguards. Countermeasures, specifications, or controls consisting of actions taken to decrease an organization's existing degree of vulnerability to a given threat, and the probability that the threat will occur.

Security Incident. An MIS security incident is any event and/or condition that has the potential to impact the security and/or accreditation of an MIS, and may result from intentional or unintentional actions. See also: Security Violation.

Security Policy. The set of laws, rules, directives, and practices that regulate how an organization manages, protects, and distributes controlled information.

Security Requirements. Types and levels of protection necessary for equipment, data, information, applications, and facilities to meet security policies.

Security Specifications. A detailed description of the security safeguards required to protect a system.

Security Violation. An event that may result in the disclosure of sensitive information to unauthorized individuals, or that results in unauthorized modification or destruction of system data, loss of computer system processing capability, or loss or theft of any computer system resources. See also: Security Incident.

Site. Usually a single physical location, but it may be one or more MIS that are the responsibility of the DSO. The system may be a stand-alone MIS, a remote site linked to a network, or workstations interconnected via a local area network (LAN).

Skipjack. A classified, NSA-designed encryption algorithm contained in the Clipper Chip. It is substantially stronger than DES and was intended to provide a Federally mandated encryption process that would enable law enforcement agencies to monitor and wiretap private communications. See also: Capstone, DES, Clipper, RSA.

Standard Security Procedures. Step-by-step security instructions tailored to users and operators of MIS that process sensitive information.

Standalone System. A single-user MIS not connected to any other systems.

Symmetric Encryption. See: Conventional Encryption.

System. See: Management Information System (MIS).

System Integrity. The attribute of a system relating to the successful and correct operation of computing resources. See also: Integrity.

System of Records. A group of any records under the control of the Department from which information is retrieved by the name of an individual, or by some other identifying number, symbol, or other identifying particular assigned to an individual. See also: Privacy Act of 1974.


TCSEC. Trusted Computer System Evaluation Criteria (TCSEC). DoD 5200.28-STD, Department of Defense, 1985. Establishes uniform security requirements, administrative controls, and technical measures to protect sensitive information processed by DoD computer systems. It provides a standard for security features in commercial products and gives a metric for evaluating the degree of trust that can be placed in computer systems for the securing of sensitive information. See also: C2, Orange Book.

Test Condition. A statement defining a constraint that must be satisfied by the program under test.

Test Data. The set of specific objects and variables that must be used to demonstrate that a program produces a set of given outcomes. See also: Disaster Recovery, Test program.

Test Plan. A document or a section of a document which describes the test conditions, data, and coverage of a particular test or group of tests. See also: Disaster Recovery, Test Condition, Test Data, Test procedure (Script).

Test procedure (Script). A set of steps necessary to carry out one or a group of tests. These include steps for test environment initialization, test execution, and result analysis. The test procedures are carried out by test operators.

Test program. A program that implements the test conditions when initialized with the test data, and that collects the results produced by the program being tested. See also: Disaster Recovery, Test Condition, Test Data, Test procedure (Script).

Threat. An event, process, activity (act), substance, or quality of being, perpetrated by one or more threat agents, which, when realized, has an adverse effect on organization assets, resulting in losses attributed to:

  • Direct loss
  • Related direct loss
  • Delays or denials
  • Disclosure of sensitive information
  • Modification of programs or data bases
  • Intangible losses (e.g., goodwill, reputation)

Threat Agent. Any person or thing that acts, or has the power to act, to cause, carry, transmit, or support a threat. See also: Threat.

Trapdoor. A secret, undocumented entry point into a computer program, used to grant access while bypassing the normal methods of access authentication. See also: Malicious Code.

Trojan Horse. A computer program with an apparently or actually useful function that contains additional (hidden) functions that surreptitiously exploit the legitimate authorizations of the invoking process to the detriment of security. See also: Malicious Code, Threat Agent.

Security Safeguards (countermeasures). The protective measures and controls that are prescribed to meet the security requirements specified for a system. Those safeguards may include, but are not necessarily limited to: hardware and software security features; operating procedures; accountability procedures; access and distribution controls; management constraints; personnel security; and physical structures, areas, and devices. Also called safeguards or security controls.

Trusted Computing Base (TCB). The totality of protection mechanisms within a computer system, including hardware, firmware, and software, the combination of which is responsible for enforcing a security policy. A TCB consists of one or more components that together enforce a security policy over a product or system. See also: C2, TCSEC, Orange Book.

Trusted Computing System. A computer and operating system that employs sufficient hardware and software integrity measures to allow its use for the simultaneous processing of a range of sensitive information, and that can be verified to implement a given security policy.


UPS (Uninterruptible Power Supply). A system of electrical components that provides a buffer between utility power, or another power source, and a load that requires uninterrupted, precise power. This often includes a trickle-charge battery system that permits a continued supply of electrical power during brief interruptions (blackouts, brownouts, surges, electrical noise, etc.) of normal power sources.


Verification. The process of comparing two levels of system specifications for proper correspondence.

Virus. Code embedded within a program that causes a copy of itself to be inserted in one or more other programs. In addition to propagation, the virus usually performs some unwanted function. Note that a program need not perform malicious actions to be a virus; it need only infect other programs. See also: Malicious Code.

Virtual Private Network (VPN). A private communications network constructed over a shared public network infrastructure, such as the Internet, that uses encryption and tunneling to protect traffic between participating sites or users.

Vulnerability. A weakness in, or unprotected area of, an otherwise secure system, or a finding of non-compliance or non-adherence to a requirement, specification, or standard, which leaves the system open to potential attack or other problems.


WAN (Wide Area Network). A network of LANs that provides communication services over a geographic area larger than that served by a LAN.

WWW. See: World Wide Web

World Wide Web. An association of independent information databases accessible via the Internet. Often called the Web, WWW, or W3.

Worm. A computer program that can replicate itself and send copies from computer to computer across network connections. Upon arrival, the worm may be activated to replicate and propagate again. In addition to propagation, the worm usually performs some unwanted function. See also: Malicious Code.

Write. A fundamental operation that results only in the flow of information from a subject to an object.

Source references:
Glossary of Computer Security Terminology, developed by the National Security Telecommunications and Information Systems Security Committee (NSTISSC) and published by NIST as NISTIR 4659. Available from NTIS as PB92-112259.

Glossary for Computer Security Terms. National Technical Information Service (NTIS), FIPS PUB 39, Springfield, VA., 02/15/76. Withdrawn 4/93. Replacement is FIPS 11-3.

Introduction to Certification and Accreditation. National Computer Security Center (NCSC), NCSC-TG-029, Ver. 1, NSA, Ft. George G. Meade, MD., January 1994.

Treasury Security Manual, TD P 71-10, Appendix B, 1993.