226 Cyber Security Assurance

1. Introduction

2. Summary of Core Elements

3. Conclusion

4. Crosswalk to DOE Directives

University of California Cyber Assurance Plan

Note: The controlled current version of this document is maintained as an appendix to the approved LBL Cyber Security Program Plan. Contact cppm@lbl.gov if you need the current version.

Cyber Security Assurance Plan Supplement Approved Version (5-11-08)


This document is an appendix to the LBL Cyber Security Program Plan and supports the University of California Assurance Plan for Lawrence Berkeley National Laboratory (LBNL PUB-5520).


“This contract reflects the application of performance-based contracting approaches and techniques which emphasize results/outcomes and minimize ‘how to’ performance descriptions. The Contractor has the responsibility for total performance under the contract, including determining the specific methods for accomplishing the work effort, performing quality control, and assuming accountability for accomplishing the work under the contract.” –C31

Introduction:

The University manages an extraordinarily complex and cutting-edge information technology environment at Berkeley Lab as a component of its performance under the contract.  Computation has been called “the third pillar” of modern science, and virtually all aspects of the scientific endeavor are computer-mediated.  LBL has been part of this transformation since the earliest days of computers, and many of the tools created at the Lab are foundational to the modern internet.  These technologies carry unique cyber security risks which must be managed to balance scientific inquiry and freedom with the need to protect resources and scientific work.
As a performance-based contractor, the University has total responsibility under the contract for the protection of computer resources deployed to carry out the mission of Berkeley Lab.  The University chooses and implements the appropriate controls and provides, for itself, assurance that the system is functioning as intended (that is, that controls are working as designed and that the set of controls is appropriately protecting the institution).
The University develops an ecology of assurance mechanisms to ensure appropriate visibility into the management of cyber security.  At the same time, the outputs from a subset of these assurance mechanisms should provide transparency to the Department of Energy (through the Berkeley Site Office) that the assurance mechanisms themselves are adequate and that the ecology of controls is functioning as intended.

Summary of Core Elements:

Performance metrics.
The Cyber Security program includes the development of cyber performance metrics, currently under Section 8 of the PEMP.  Cyber metrics are developed annually with the Berkeley Site Office, supported by SC-wide guidance, and are designed to reflect real enhancements and efforts related to efficiently protecting LBL resources and encouraging integrated safeguards and security management.
In addition, the Cyber Security Program maintains incident data, separate from contract performance metrics, which captures the extent and severity of incidents experienced at the Lab and allows the program to adjust its protections accordingly.
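
To make the idea concrete, the sketch below shows one way such incident data could be aggregated by quarter and severity to guide protection adjustments. It is a minimal illustration in Python; the record layout, severity labels, and example data are hypothetical, not drawn from the Lab's actual systems.

    from collections import Counter
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Incident:
        # Hypothetical record layout; the program's real schema is defined elsewhere.
        opened: date
        severity: str           # e.g. "low", "moderate", "high"
        root_cause: str = ""    # filled in after incident review

    def quarterly_severity_counts(incidents):
        """Count incidents per (year, quarter, severity) to expose trends."""
        counts = Counter()
        for inc in incidents:
            quarter = (inc.opened.month - 1) // 3 + 1
            counts[(inc.opened.year, quarter, inc.severity)] += 1
        return counts

    # Illustrative data only.
    history = [Incident(date(2008, 2, 1), "high", "phishing"),
               Incident(date(2008, 3, 9), "low", "misconfiguration")]
    for key, count in sorted(quarterly_severity_counts(history).items()):
        print(key, count)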

Assessments and Reviews.
NIST and DOE guidance require annual reviews of controls through self-assessment for most enclaves at Berkeley Lab.  In addition, an annual external (non-system-owner) review is required for the business systems.
In parallel, the Laboratory uses many different levels of audits, internal reviews, and automated reviews to provide assurance.  In brief, the levels of reviews are as follows:

Self-Assessment
Continuous Monitoring: As outlined in the CSPP, many aspects of the environment are subject to continuous monitoring for security and compliance with policy.  Critical tools used in this area are vulnerability scanning, the use of Bro to detect policy violations, and tools to detect configuration changes in institutional services (a simplified sketch of this kind of configuration-change detection appears after this list).
Incident Reviews: The cyber security program conducts reviews of each incident.  See also “continuous improvement”.
Annual Controls Reviews: Annually, the Laboratory conducts a NIST 800-53-based review of all security controls, as well as a complete risk assessment.
Annual Cyber Security Risk and Self-Assessments: The Office of the CIO undertakes annual risk assessments and self-assessments of its information technology posture.  The self-assessment process seeks to verify the functioning of technical, administrative, and operational controls, while the risk-assessment process is designed to provide transparency to DOE and the Laboratory community on current and emerging threats, as well as residual risks from our security posture.

Internal Review
Internal Audit: The University of California core audit program includes cyber and information technology reviews.  The audit program typically includes at least one such audit a year.

External Review
Peer Reviews: The Laboratory makes targeted use of peer reviews on an as-needed basis.  In the past three years, one peer review of ESnet security and another of the 800-53 Certification and Accreditation process were conducted.


External Audit: While the University does not control such efforts, the Cyber Security Program has been under nearly continuous review by external auditors for the past three years, and such reviews are used as part of our ongoing analysis of the program.
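
The configuration-change detection mentioned under continuous monitoring above can be pictured as baseline-and-compare. The sketch below is a simplified stand-in for the Lab's actual tooling: it hashes a set of watched configuration files and reports any that differ from a saved baseline. The file paths and baseline location are hypothetical.

    import hashlib
    import json
    from pathlib import Path

    BASELINE = Path("config_baseline.json")   # hypothetical baseline store

    def fingerprint(path):
        """Return the SHA-256 digest of a file's current contents."""
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def save_baseline(watched_files):
        """Record the current fingerprints as the approved configuration."""
        BASELINE.write_text(json.dumps({f: fingerprint(f) for f in watched_files}))

    def check_drift(watched_files):
        """Return the watched files whose contents no longer match the baseline."""
        baseline = json.loads(BASELINE.read_text()) if BASELINE.exists() else {}
        return [f for f in watched_files if baseline.get(f) != fingerprint(f)]

    # Usage: call save_baseline(["/etc/ssh/sshd_config"]) after an approved change,
    # then run check_drift(["/etc/ssh/sshd_config"]) on a schedule and alert on
    # any file it returns.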

Issues Management.
The Cyber Security Program follows the LBNL Issues Management Program (LBNL PUB-5519) for managing issues. The Cyber Security Program utilizes the Laboratory’s Corrective Action Tracking System (CATS) for the management of corrective actions.  The system provides multi-layered tracking and reporting of corrective actions.


Corrective actions are identified through self-assessments, incident assessments, and audits and reviews.  Major corrective actions are also reported to DOE (through the Office of Science) via the Plan of Action and Milestones (POA&M) process.  POA&Ms are an integral part of quarterly Federal Information Security Management Act reporting and are typically tracked within the LBL CATS system as well.
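
As a rough illustration of the milestone tracking implied by CATS and the POA&M process, the sketch below models a corrective action with an owner and due date and flags overdue items. The field names and example entry are hypothetical; CATS defines the real records.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class CorrectiveAction:
        # Hypothetical fields; CATS and the POA&M process define the real schema.
        finding: str
        owner: str
        due: date
        completed: bool = False

    def overdue(actions, today):
        """Return open corrective actions whose milestone date has passed."""
        return [a for a in actions if not a.completed and a.due < today]

    open_items = [CorrectiveAction("patch legacy web server", "sysadmin-team",
                                   date(2008, 4, 30))]
    for action in overdue(open_items, today=date(2008, 5, 11)):
        print(f"OVERDUE: {action.finding} (owner: {action.owner}, due {action.due})")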


All cyber security incidents are tracked and analyzed with the goal of identifying proximate and root causes.

Lessons learned.
Cyber security incident reports follow defined reporting channels, with primary reporting to the Department of Energy’s Computer Incident Advisory Capability (CIAC) or equivalent and copies to Counterintelligence, the Office of the Inspector General, and the Berkeley Site Office.  Incident reports are shared internally with key stakeholders to assure broad knowledge of current risks.  Likewise, the Laboratory’s cyber security staff remains abreast of new trends in attacks and threats, primarily from public-sector sources but also from DOE sources such as CIAC alerts.  As appropriate, briefings and discussions of cyber security incidents are entered into the LBNL Lessons Learned and Best Practices database and disseminated to targeted staff.  These inputs, along with broad-based incident review, allow the Laboratory to adjust its protection mechanisms continuously to ensure optimal protection.
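
Purely for illustration, the reporting channels just described can be read as a simple routing table: one primary recipient plus fixed copy lists. The sketch below paraphrases the channels named in this section and carries no operational detail; the structure and names are assumptions for the example.

    # Hypothetical routing of an incident report along the channels named above.
    REPORT_ROUTES = {
        "primary":  ["DOE CIAC (or equivalent)"],
        "copies":   ["Counterintelligence", "Office of the Inspector General",
                     "Berkeley Site Office"],
        "internal": ["key stakeholders"],
    }

    def recipients():
        """Yield every recipient an incident report should reach."""
        for channel in ("primary", "copies", "internal"):
            for recipient in REPORT_ROUTES[channel]:
                yield channel, recipient

    for channel, recipient in recipients():
        print(f"{channel}: {recipient}")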

The Continuous Monitoring Program.
The CSPP discusses the cyber security program, which is inherently an assurance program for the Laboratory.  A few of the critical assurance systems are defined here; additional information is available in the CSPP and supporting documents.


 

For each desired outcome, the list below identifies the supporting assurance system and how we demonstrate that the system is working.

Outcome: Systems are securely configured and meet requirements.
Assurance system: Vulnerability scanning, continuous and on demand, to identify insecurely configured or vulnerable systems, with actions in response to a finding of vulnerability.
Demonstration: On-request access to blocked-host history lists and web site information with current scans.

Outcome: Systems are not infected or attacking other systems.
Assurance system: Monitoring systems provide indications of vulnerable systems.
Demonstration: On-request access to Bro logs and incident investigation reports.

Outcome: Attackers cannot search for targets indiscriminately.
Assurance system: Monitoring systems (Bro, Syslog, Netflow) provide defenses against indiscriminate attackers.
Demonstration: On-request access to Bro logs.

Outcome: Users are trained.
Assurance system: LBL Training Database.
Demonstration: Report outputs on training rates and percentages as part of the PEMP.

Outcome: Security systems are operational.
Assurance system: Monitoring and alerting systems to detect failures in critical cyber defense systems.
Demonstration: On-request access to Nagios and related logging reports.

Outcome: DOE and LBL jointly understand residual risk.
Assurance system: Annual risk assessment and ongoing briefings as necessary; cost-benefit analysis of the cyber program.
Demonstration: Dialogue with the Site Office.
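
The first outcome above (a scan finding leading to action, with a blocked-host history as the evidence) can be sketched as a small workflow. This is an illustrative Python outline under assumed names; block_host stands in for whatever network control the Lab actually uses, and the host and vulnerability values are examples.

    from datetime import datetime

    blocked_host_history = []   # the on-request evidence named above

    def block_host(host):
        # Stand-in for the real network control (e.g. a router ACL or firewall
        # rule); printed here so the sketch is self-contained and runnable.
        print(f"blocking {host} pending remediation")

    def handle_scan_finding(host, vulnerability):
        """Record a vulnerability finding from a scan and act on it."""
        blocked_host_history.append({
            "host": host,
            "vuln": vulnerability,
            "blocked_at": datetime.now().isoformat(),
        })
        block_host(host)

    handle_scan_finding("192.0.2.10", "example: unpatched network service")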


 

Reporting.
Reporting is performed as detailed in PUB-5520, section 2.4. Primary cyber and information technology reporting mechanisms include:

  • Cyber security incident reports
  • Computer Protection Implementation Committee
  • Cyber Security Enclave Security Team

Qualifications for Staff Who Perform Assurance Functions.
Internal Audit: All UC/LBNL Internal Audit Services staff who perform cyber and information technology reviews maintain professional internal audit certification.
Cyber Security: All UC/LBNL cyber security staff who perform self-assessment receive formal training through an annual briefing session on the self-assessment activities and, as cyber security subject matter experts, keep abreast of current technical developments.  Their work is overseen by the Chief Information Officer and the Computer Protection Program Manager, and the work can draw, when necessary, on the collective expertise of other Labs and other UC campuses, including through peer review.

 

Conclusion

The University creates and manages a diverse cyber environment in support of LBL’s scientific and operational goals.  The security of these systems is important, but it must be risk-based and grounded in cost-benefit analysis to guard against over-protection at the expense of scientific productivity.  The University’s approach, as defined here, in the Cyber Security Program Plan, and in the University’s Assurance Plan, formulates robust systems of management assurance which help to ensure strong management controls of cyber security.  At the same time, the existence and output of these assurance systems help to sustain a relationship between the University and the Department that supports the values of Performance-Based Management.

Appendix A
LBNL Conformance with DOE Oversight Policy 

Requirements and Standards

The LBNL Assurance Program, as documented in the UC Assurance Plan for LBNL, conforms to all requirements identified in the Contractor Requirements Document (CRD) of DOE Order 226.1, Department of Energy Oversight Policy.

The crosswalk below maps each requirement of DOE O 226.1, Attachment 2 (CRD), Appendix A to the corresponding sections of the UC Assurance Plan and to the cyber security assurance systems that address it.

DOE O 226.1 CRD: 1.a Comprehensive and integrated contractor assurance system established:
(1) Identify deficiencies and opportunities for improvement
(2) Report deficiencies to responsible managers
(3) Implement effective corrective actions
UC Assurance Plan: Objectives and Applicability; Assurance Program; Section 2.3, Assessment; Section 2.4, Reporting; Section 3.2, Corrective Action
Cyber Security Assurance Systems: Integrates with the UC Assurance Plan for LBNL. Deficiencies are identified through self-assessment and risk assessment (Cyber Security Assurance Plan (CSAP) supplement, section 2b) and reported to managers via risk assessment, CPIC, ITAC, and direct reporting by the CIO where necessary (UC Assurance Plan). Corrective action implementation is tracked via the POA&M process or CATS (CSAP suppl., sec. 2c).

DOE O 226.1 CRD: 1.b Assurance activities must include: (1) assessments; (2) incident/event reporting, including accident investigations; (3) worker feedback mechanisms; (4) issues management, including causal analysis and corrective action management; (5) lessons learned programs; and (6) performance indicators/measures.
UC Assurance Plan: See details below.
Cyber Security Assurance Systems: See below.

DOE O 226.1 CRD: 1.c Contractor assurance system data must be documented and available to DOE.
UC Assurance Plan: Section 2.3, Assessment; Section 2.4, Reporting; Section 3.2, Corrective Action
Cyber Security Assurance Systems: Cyber Security Program Plan, CSAP supplement, and UC Assurance Plan.

DOE O 226.1 CRD: 1.d Contractors will establish processes for corporate audits, third-party certifications, or external reviews.
UC Assurance Plan: Section 2.3, Assessment
Cyber Security Assurance Systems: The UC Internal Audit Plan contains core audits required by the University, which cover IT and cyber security (CSAP suppl., sec. 2b). External reviews are conducted at least triennially as part of the Authority to Operate process (CSAP suppl., sec. 2b).

DOE O 226.1 CRD: 1.e Program effectiveness can be certified by third parties as a complement to internal assurance systems.
UC Assurance Plan: Third-party certification is not required. However, several LBNL functions have achieved third-party certifications, including the Earned Value Management System, the Environmental Management System, Accreditation Association for Ambulatory Health Care accreditation, and DOE Laboratory Accreditation Program accreditation for bioassay and external dosimetry.
Cyber Security Assurance Systems: There is no current certification appropriate to the R&E computing environment; however, the program is externally validated as part of the Authority to Operate.

DOE O 226.1 CRD: 1.f Contractors must monitor and evaluate all work performed under their contracts, including the work of subcontractors.
UC Assurance Plan: Objectives and Applicability
Cyber Security Assurance Systems: Most cyber security control mechanisms apply to all systems, including those of subcontractors, when on the network. Other mechanisms flow through contract SPs.

DOE O 226.1 CRD: 2.a Self-assessment is used to evaluate performance.
(1) Management self-assessments are performed by contractor management and are developed based on the nature of the facility/activity being assessed and the hazards and risks to be controlled.
(2) Self-assessments involve workers, supervisors, and managers to encourage identification and resolution of deficiencies at the lowest level practicable.
  • Support organizations will perform self-assessments of their performance and the adequacy of their processes.
  • Contractors, at all levels, will assess the implementation and adequacy of their processes, including analysis of the collective results of lower-level self-assessments.
  • Self-assessment results will be documented commensurate with the significance of and risks associated with the activities being evaluated.
UC Assurance Plan: Section 2.3, Assessment; Section 2.3.1, Self-Assessment
Cyber Security Assurance Systems: Annual self-assessment requirements are defined in the Common Controls CSPP (CSAP suppl., sec. 2b). At the same time, the entire program represents a set of ongoing assurance systems (searching for indications of compromised and vulnerable hosts and assessing the overall posture of the Laboratory). Tools such as IDS and vulnerability scanning run continuously to create operational awareness of the state of security.

DOE O 226.1 CRD: 2.b Internal independent assessments will be performed by contractor organizations.
(1) The assessments will be formally planned and scheduled based on the risk, hazards, and complexity of the areas assessed.
(2) Independent evaluators will be appropriately trained and qualified and have knowledge of the areas assessed.
(3) Reviewers will be dedicated contractor staff, members of external organizations, or both.
  • Although independent assessments are applied to individual activities and processes, they will typically focus on facilities, projects, programs, and management processes used by multiple organizations.
  • Internal independent assessments will concentrate on performance, observation of work activities, and the results of process implementation.
UC Assurance Plan: Section 2.3, Assessment; Section 2.3.2, Internal Review; Section 2.3.3, External Review
Cyber Security Assurance Systems: The University formally schedules core and non-core internal independent audits. In addition, annual independent assessment of the BSE is scheduled by the Office of the CIO (CSAP suppl., sec. 2b).

DOE O 226.1 CRD: 3.a Reportable occurrences that meet occurrence reporting and processing system thresholds, and associated corrective actions, will be evaluated, documented, and reported as required.
UC Assurance Plan: Section 2.4.3, Event Reporting and Analysis
Cyber Security Assurance Systems: Thresholds are defined in the CSPP. A defined reporting process to CIAC, the Site Office, the IG, and CI is also defined in the CSPP (UC Assurance Plan).

DOE O 226.1 CRD: 3.b For activities covered by the Price-Anderson Amendments Act, nuclear worker safety and health issues meeting DOE reporting thresholds should be self-reported.
UC Assurance Plan: Section 2.4.3, Event Reporting and Analysis
Cyber Security Assurance Systems: N/A

DOE O 226.1 CRD: 3.c Trending analysis of events, accidents, and injuries is performed in accordance with structured/formal processes.
UC Assurance Plan: Section 2.4.3, Event Reporting and Analysis
Cyber Security Assurance Systems: The NIST-based risk assessment utilizes a defined approach in which we track prior incidents for an understanding of risk (UC Assurance Plan).

DOE O 226.1 CRD: 4. Worker feedback. DOE contractors will establish and implement processes to solicit feedback from workers and work activities.
UC Assurance Plan: Section 2.4.4, Worker Feedback
Cyber Security Assurance Systems: In addition to the Laboratory’s shared systems for worker feedback, the cyber security program relies on help desk tickets from users, direct feedback to the program, and the Computer Protection Implementation Committee (UC Assurance Plan).

DOE O 226.1 CRD: 5.a Program and performance deficiencies, regardless of their source, must be captured in a system or systems that provide for effective analysis, resolution, and tracking. Issues management must include structured processes for:
(1) Determining risk, significance, and priority;
(2) Evaluating the scope and extent of the condition or deficiency;
  • Determining event reportability under applicable requirements;
  • Identifying root causes (applied using a graded approach);
  • Identifying and documenting suitable corrective actions and recurrence controls;
  • Identifying individuals/organizations responsible for implementing corrective actions;
  • Establishing appropriate milestones for completion of corrective actions, including consideration of significance and risk;
  • Tracking progress toward milestones to ensure timely completion of actions;
  • Verifying that corrective actions are complete;
  • Validating that corrective actions are effectively implemented, using a graded approach;
  • Ensuring that individuals and organizations are accountable for performing their assigned responsibilities.
UC Assurance Plan: Section 3.2, Issues Management; Section 3.2.1, Issues Tracking; Section 3.2.3, Effectiveness Review
Cyber Security Assurance Systems: Issues management and tracking are covered by the CSPP with regard to incident reporting, POA&M tracking, and supplemental LBL incident reporting training and guidance. It also draws from the UC Assurance Plan.

DOE O 226.1 CRD: 5.b Issues management will provide a process for rapidly determining the impact of identified weaknesses and taking timely action to address conditions of immediate concern.
UC Assurance Plan: Section 3.2.1, Corrective Action Tracking; Section 3.2.3, Extent of Condition Review
Cyber Security Assurance Systems: Monitoring CATS and distributing POA&Ms.

DOE O 226.1 CRD: 5.c Processes for analyzing deficiencies, individually and collectively, must be established to enable the identification of programmatic or systemic issues. Process products will be used by management to monitor progress and optimize allocation of assessment resources.
UC Assurance Plan: Section 3.2.2, Data Monitoring and Analysis; Section 3.2.4, Extent of Condition Review
Cyber Security Assurance Systems: See above.

DOE O 226.1 CRD: 5.d Sites must have an effective process for communicating issues up the management chain to senior management, using a graded approach that considers hazards and risks.
UC Assurance Plan: Section 3.2.2, Data Monitoring and Analysis
Cyber Security Assurance Systems: See above.

DOE O 226.1 CRD: 6. Lessons Learned. Formal programs must be established to communicate lessons learned during work activities, process reviews, and event analyses to potential users and applied to future work activities.
UC Assurance Plan: Section 3.3, Lessons Learned and Best Practices
Cyber Security Assurance Systems: See above.

DOE O 226.1 CRD: 7. Performance Measures. Contractors must identify, monitor, and analyze data measuring the performance of facilities, programs, and organizations. Performance indicator data must be considered in allocating resources, establishing goals, identifying performance trends, identifying potential problems, and applying lessons learned and good practices.
UC Assurance Plan: Section 2.2, Performance Metrics; Section 2.4.1, Assurance Reports; Section 2.4.2, Annual Contract Self-Appraisal Report
Cyber Security Assurance Systems: See PEMP.

 

 

     
     

 

 
