The main driver behind software development has always been functionality. Software is meant to provide new or expanded access to data and services, and the requirements for software are typically written with these needs in mind.
Data privacy and integrity, often neglected in functional specs, have become non-negotiable business requirements, largely due to regulations enacted in the wake of public exposures over the last five years. These regulations require software to behave differently: in how it identifies users, stores sensitive data, and records access to system resources. Given the volume and complexity of the regulations, translating their intent and impact into software development practice is often frustrating for development teams. Let's look at a few of the most common regulations, explore their impact on the software development lifecycle, and determine how to handle them as senior managers of information.

Got SOX?

The Sarbanes-Oxley Act (SOX) was signed into law in 2002. In the wake of the accounting scandals of the period, its purpose was to give investors more confidence in the transparency and accuracy of the financial reporting of publicly traded U.S. companies.

In a SOX audit, a company must be able to prove that confidential information cannot be exposed to unauthorized entities. Applications must generate sufficient information to establish that critical data has not been modified, and must show that user roles, file access and all other points of data access are locked down to only those users who are supposed to have them.

Translating these requirements into developer terms is by no means straightforward, but it is manageable. Executive-level identification of financial management systems provides the basis for development-level understanding of appropriate access and authorization control. This means that, as senior managers, we must identify critical data and acceptable use/access cases for our development teams. Once this happens, the developed system is easier to audit.

HIPAA

The Health Insurance Portability and Accountability Act (HIPAA) was passed in 1996 by the U.S. Congress in an attempt to guarantee that employees who changed or lost jobs could carry their healthcare benefits forward to a new provider. Doing so clearly required uniquely identifying people and associating their health records with them. Because so much of this information is private, and because of widespread concern that private health information might negatively affect insurability or employability, HIPAA also established federal regulations requiring doctors, hospitals, other healthcare providers and insurers to meet baseline standards when handling electronic information, such as medical records and patient accounts.

The security provisions of HIPAA comprise three sets of requirements, each listing specific safeguards. These provisions are of primary concern to software teams because they contain specific technical compliance requirements. The technical requirements look very similar to those of SOX because the driving concerns overlap: in SOX, the drivers are transparency, data integrity and auditability, whereas in HIPAA, they are data confidentiality and auditability. That similarity leads to a natural commonality in the underlying development strategy. As with SOX, HIPAA concerns are best validated and verified by analyzing the application itself. Transforming the requirements into software is inherently complex, and the multiple points of contact with the data demand that the applications themselves be analyzed for consistency in protecting that data.

SB 1386

SB 1386 is a California Senate bill that amended existing privacy laws to require disclosure of privacy violations. It applies to organizations that maintain personal information on customers and do business in the state of California.
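The SOX and HIPAA requirements above reduce, at the code level, to controls such as role-based access checks, tamper-evident records, and an audit trail of every access attempt. The following is a minimal sketch of those controls in Python; the role names, record contents, and key handling are illustrative, not prescribed by either regulation:

```python
import hashlib
import hmac
import logging

# Two controls auditors commonly look for:
# (1) role-based access checks on sensitive records, and
# (2) an audit trail of every access attempt, allowed or denied.
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
audit_log = logging.getLogger("audit")

# Illustrative roles: who may do what with critical financial data.
ROLE_PERMISSIONS = {
    "controller": {"read", "write"},   # financial controller
    "auditor": {"read"},               # read-only review access
    "clerk": set(),                    # no access to critical data
}

AUDIT_KEY = b"rotate-me-out-of-band"   # illustrative; keep real keys in a vault


def record_fingerprint(record: bytes) -> str:
    """HMAC of a record, so a later audit can prove it was not modified."""
    return hmac.new(AUDIT_KEY, record, hashlib.sha256).hexdigest()


def access_record(user: str, role: str, action: str, record: bytes) -> bool:
    """Allow or deny the action, and log the attempt either way."""
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    audit_log.info(
        "user=%s role=%s action=%s allowed=%s fingerprint=%s",
        user, role, action, allowed, record_fingerprint(record),
    )
    return allowed


ledger_entry = b"2002-Q3 revenue: 1,200,000 USD"
print(access_record("alice", "auditor", "read", ledger_entry))   # True
print(access_record("bob", "clerk", "read", ledger_entry))       # False
```

The point of the fingerprint is that it changes if the record changes, which lets an auditor verify integrity after the fact; a production system would add persistent, append-only log storage rather than plain logging.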
The narrow scope and lack of ambiguity in the legal wording have given regulators significant power to enforce SB 1386 in a variety of cases. In fact, there have already been many documented cases of companies found to be in violation. For practical purposes, the bill imposes two compliance steps:

1. Ensure the privacy of customer data at all costs.
2. Disclose all cases where personal information meeting the previously mentioned criteria is reasonably suspected of having been improperly disclosed to, or acquired by, an unauthorized person or entity.

The most common disclosure triggers involve the accidental loss of data stored on physical media, e.g., backup tapes and laptops. In these events, disclosure is required because the data was unencrypted. Using encryption for data at rest and in transit will minimize a company's exposure. But companies should beware: encryption is often implemented incorrectly. They should seek guidance and tools to help with the implementation and verification of properly encrypted data.

Understanding compliance can be difficult, and there are very few resources that map regulatory requirements to software development or management requirements. The difficulty of understanding the legislation, however, does not lessen its importance: the ramifications of non-compliance are very real business risks. The regulations themselves may be complex, but meeting their requirements doesn't have to be. It boils down to a few critical steps:

- Identify which regulations are important for your industry and for the specific application you are assessing.
- Relate the business requirements to your software development process at every phase, from requirements analysis to design, development, testing and deployment.
- Follow best practices as appropriate in the areas of confidentiality, integrity, availability, auditing and logging, and authentication.

Source: www.CIOupdate.com
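The encryption-at-rest guidance discussed under SB 1386 can be sketched as follows. This is a minimal illustration using the widely used third-party Python cryptography package (its Fernet recipe); the record contents are invented, and the in-process key generation stands in for what should be a key-management service in any real deployment:

```python
# Sketch: encrypt customer data before it is written to backup media,
# using the third-party "cryptography" package (pip install cryptography).
# Key handling is deliberately simplified; real deployments fetch keys
# from a key-management service, never generate them next to the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice: loaded from a KMS
cipher = Fernet(key)

customer_record = b"name=J. Doe; ssn=000-00-0000"   # invented example data
token = cipher.encrypt(customer_record)             # safe to write to tape

assert customer_record not in token                 # no plaintext on media
assert cipher.decrypt(token) == customer_record     # recoverable with key
```

A lost backup tape containing only such tokens, with the key held elsewhere, is exactly the scenario in which disclosure obligations are narrower, which is why the article stresses verifying that the encryption is actually applied correctly rather than merely configured.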