Data Security Guidelines and Methodologies--White Paper
This article applies to: Security Essentials for IT Professionals
Daniel Adinolfi, CISSP
Senior Security Engineer
Cornell Information Security Office
Data security practices must be factored into the design and implementation of any information service. It is in the best interests of the people whose personal data we maintain that we prevent disclosure of that data to unauthorized parties. It is also in the institution's best interest to protect institutional data, ensuring its integrity and availability. Finally, because of data privacy laws such as FERPA, HIPAA, GLBA, and the New York State Information Security Breach and Notification Act, we must work to prevent any loss of data regulated by those laws and reduce the effect of any potential compromises.
The confidentiality, integrity, and availability of your data must be a priority for any application that is being purchased or built by local developers.
Confidentiality is the concept that data will only be viewable by those who are explicitly permitted to view it.
Integrity is the concept that data will not change in unexpected or unauthorized ways. Whatever processes or users affect the data will do so predictably and without errors.
Availability is the idea that your application and the data within it will be accessible to the intended audience whenever that access is needed. This differs from confidentiality in that it addresses the uptime of a service and how users communicate with it.
When considering confidentiality, integrity, and availability, the following questions should be asked:
- What would be the consequences if data were to be accessed by someone who is not authorized to access it?
- What would be the consequences if data were modified in a way that was outside the expected mechanisms?
- What would be the consequences if the data or server were made unavailable when it is needed?
As a general rule, for confidentiality and availability, apply the concept of "least privilege": access to data resources should be limited so that each user and each task is granted only the minimum access required. For integrity, testing should be performed before any system is implemented to ensure the data does not become corrupted, and regular logging and log analysis should be performed to support debugging and incident response. This is part of a larger risk analysis that weighs the cost of auditing users and processes against the cost of data loss from incidents involving abuse of access by authorized or unauthorized users.
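The least-privilege rule above can be sketched as a deny-by-default permission check, where each role holds only the specific (resource, action) pairs its tasks require. The roles and resources below are hypothetical, purely for illustration:

```python
# Minimal sketch of "least privilege": each role is granted only the
# specific (resource, action) pairs its tasks require; every other
# request is denied by default. Role and resource names are illustrative.

GRANTS = {
    "report_clerk": {("student_records", "read")},
    "registrar":    {("student_records", "read"), ("student_records", "write")},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Deny by default; permit only explicitly granted (resource, action) pairs."""
    return (resource, action) in GRANTS.get(role, set())
```

The important property is the default: any role, resource, or action not explicitly granted is refused, rather than relying on an enumerated deny list.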
The exact requirements for your data and services with regard to confidentiality, integrity, and availability will depend on a number of factors. One factor is the presence of federal, state, or local legislation regulating the data. Cornell and its departments and units also have their own regulations that must be factored in, specifically those in University Policy Volume 4, "Governance". There is an ongoing policy effort to better define data categories and the minimum security requirements for those categories. Data stewards, the Cornell Policy Office, and CIT are collaborating in this effort.
There are certain questions that should be asked before either a product evaluation or internal service design. The answers to these questions will help set the requirements for that evaluation or design. Also, consider the sensitivity of the data in question and whether there should be special requirements for data access and security.
- What entities or individuals will require access to the data or service in question? For example, will users be accessing the data from campus? From home? From specific subnets on campus?
- What is the least amount of data that each of those entities or individuals will require?
- From where will the data be accessed?
- What are the availability requirements for this access? 24x7? Business hours?
- What regulations, if any, affect the data being utilized by this service? For example, student records are covered by FERPA, medical data is covered by HIPAA, and financial data is covered by GLBA.
Though the focus of this document is on the development and implementation of networked services, consideration must be given to all hosts that will hold the sensitive data being processed by those services. This includes database servers, proxies, and user systems, to name a few.
When specific technologies are evaluated, the following should also be considered. The answers to these questions will help indicate how well security has been designed into the service.
- Where will data be stored? How will it be stored?
- What mechanisms will be used to access the data? For example, is this a web-based application? Does it use its own client? What is the protocol used to communicate over the network?
- What logging is available through the operating systems on which the application runs? From the application itself?
- What authentication and authorization mechanisms are available? How are those mechanisms maintained? (Authentication is the process of verifying the identity of the user. Authorization is restricting access based on a user's identity.)
- Who is responsible for the maintenance and upkeep of each component of the service in question? For example, who will maintain the OS, the application, any databases, etc.?
- What can be done to reduce the impact of a disaster affecting the service?
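The distinction drawn above between authentication (verifying a user's identity) and authorization (restricting access based on that identity) can be sketched as two separate checks. The user store, salt handling, and permission table here are simplified assumptions; a production service would rely on an institutional identity provider and a vetted credential store:

```python
import hashlib
import hmac
import os

# Sketch separating authentication ("who are you?") from authorization
# ("what may you do?"). A single shared salt and an in-memory user table
# are simplifications for illustration only.

def _hash(password: str, salt: bytes) -> bytes:
    # Slow, salted hash so stored credentials resist offline guessing.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

SALT = os.urandom(16)
USERS = {"jdoe": _hash("correct horse", SALT)}   # authentication data
PERMS = {"jdoe": {"grades:read"}}                # authorization data

def authenticate(user: str, password: str) -> bool:
    """Verify the claimed identity against the stored credential."""
    stored = USERS.get(user)
    return stored is not None and hmac.compare_digest(stored, _hash(password, SALT))

def authorize(user: str, permission: str) -> bool:
    """Check what an already-authenticated identity is allowed to do."""
    return permission in PERMS.get(user, set())
```

Keeping the two checks separate matters when answering the maintenance question above: credentials and permissions often have different owners and different update cycles.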
As one can see, some of these questions relate to specific technological issues, while others have more to do with management and administrative methodologies. Security controls can be designed to affect both the technical and the administrative sides of a particular service.
From a technical point of view, a number of technologies can be leveraged to secure a service and its data.
- On the network side, packet filtering, via firewalls or router access control lists, can be used to limit what traffic can reach the servers.
- Servers running services should be placed on separate subnets from end user systems.
- Intrusion detection systems can offer logging and monitoring for network-based attacks and unusual network behavior.
- Communications between all the components of the service, both client-server communication and server-server communication, should also be considered. Is sensitive data being sent in such a way that it could be intercepted and compromised?
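Packet filtering of the kind mentioned above, whether in a firewall or a router access control list, amounts to evaluating an ordered rule list with a default-deny fallback. A minimal sketch, using placeholder subnets and ports rather than real campus addresses:

```python
import ipaddress

# First-match packet filter in the style of a router access control list:
# rules are evaluated in order, and traffic matching no rule is denied.
# The networks and ports below are placeholders.

RULES = [
    # (action, source network, destination port)
    ("allow", ipaddress.ip_network("10.0.0.0/24"), 443),  # campus subnet -> HTTPS
    ("deny",  ipaddress.ip_network("0.0.0.0/0"),   443),  # everyone else
]

def filter_packet(src_ip: str, dst_port: int) -> str:
    """Return "allow" or "deny" for a packet, using first-match semantics."""
    addr = ipaddress.ip_address(src_ip)
    for action, net, port in RULES:
        if addr in net and dst_port == port:
            return action
    return "deny"  # default deny for unmatched traffic
```

Rule order is significant: placing the broad deny before the campus allow would block everything, which is why access control lists are reviewed top to bottom.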
The hosts themselves should be securely configured, implementing technology to address:
- patch management,
- remote access,
- OS-level integrity verification,
- application integrity verification,
- and secure data storage.
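OS- and application-level integrity verification is commonly done by hashing files and comparing the results against a recorded baseline, in the spirit of tools such as Tripwire or AIDE. A minimal sketch follows; a real deployment would keep the baseline off-host and protect it from tampering:

```python
import hashlib
from pathlib import Path

# Sketch of file integrity verification: hash each monitored file and
# compare against a previously recorded baseline, reporting anything
# that changed or disappeared.

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths: list[str]) -> dict[str, str]:
    """Record the current hash of each monitored file."""
    return {p: sha256_of(Path(p)) for p in paths}

def verify(baseline: dict[str, str]) -> list[str]:
    """Return paths whose current hash differs from the baseline."""
    return [p for p, digest in baseline.items()
            if not Path(p).exists() or sha256_of(Path(p)) != digest]
```

Running such a check on a schedule, and alerting on any non-empty result, turns the integrity-verification bullet points above into a concrete monitoring task.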
Also, server processes and logs should be monitored and reported on in an organized fashion.
The processes built into and around the service and the policies that are applied to it address the administrative side of the service's security posture. An appropriate password and authorization policy should be created for the service based on the access requirements.
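A written password policy becomes enforceable when its rules are applied at password-change time. The thresholds below are placeholders for illustration, not an institutional standard:

```python
# Encode a password policy as a check applied when passwords are set or
# changed. The minimum length and character-class rules are assumptions;
# substitute the actual institutional policy.

MIN_LENGTH = 12

def password_meets_policy(password: str) -> bool:
    """Require minimum length plus lowercase, uppercase, and digit characters."""
    return (len(password) >= MIN_LENGTH
            and any(c.islower() for c in password)
            and any(c.isupper() for c in password)
            and any(c.isdigit() for c in password))
```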
For services that utilize sensitive data, yearly user awareness classes should be held to ensure users understand the regulatory and institutional security requirements that apply to the data. Yearly audits of the service should also be performed to ensure all of its components conform to the security policies assigned to it.
The physical security requirements of the systems running the service must be considered as well. Sufficient heating, ventilation, and air conditioning must be provisioned, and backup power systems should be implemented to help ensure that availability requirements are satisfied in the event of power incidents. The systems should also be isolated so that only those authorized to physically access them can do so, whether through a locked room or a locked rack in a server farm.
The Cornell IT Security Office can assist with identifying the technical requirements and working through specific implementation details that will satisfy any security policy requirements.