Saturday, June 15, 2019

Aligning NIST Privacy Framework with IAPP’s Privacy Product Categories and Measures



In the 21st century, many organizations face challenges in designing, operating, or using technologies in ways that are mindful of diverse privacy needs in an increasingly connected and complex environment. Cutting-edge technologies, which enhance convenience, efficiency, and economic growth, are raising further concerns about their impact on individuals’ privacy. While good cybersecurity practices help manage privacy risk by protecting individuals’ information, privacy risks can also arise from how organizations collect, store, use, and share this information to meet their mission or business objectives, as well as from how individuals interact with products and services. The use of detailed data about individuals in these new technologies can make protecting their privacy harder. New technology solutions are needed to efficiently manage and operationalize data privacy: many organizations rely increasingly on data to drive business, new technologies keep flowing into the workplace, and regulatory requirements demand demonstration of ongoing compliance. Two projects aim to help meet this challenge:

·         Recently the National Institute of Standards and Technology (NIST) announced a collaborative project to develop a voluntary privacy framework to help organizations better identify, assess, prioritize, manage, and communicate privacy risks; bridge the gaps between privacy professionals and senior executives to foster innovative approaches that flexibly and effectively protect individuals’ privacy without stifling innovation; and increase trust in products and services. NIST’s approach for this framework is based on the successful, open, transparent, and collective approach used to develop NIST’s Cybersecurity Framework. Unofficially, the new framework aims to create a common vocabulary between lawyers, privacy practitioners, developers, the cybersecurity team, and the C-suite to enable true privacy engineering (NIST has mapped the Cybersecurity Framework to the Privacy Framework core to help organizations identify similarities and differences and develop a streamlined risk management process for both).
·         In the last several years, the privacy technology market has gone from an emerging space to a full-blown, dynamic ecosystem. With new and robust compliance requirements, many established companies are now part of the privacy technology menu, making for a rich marketplace. To help companies navigate the influx of solutions, the IAPP created the Privacy Tech Vendor Report, which encompasses the product categories in the privacy ecosystem. The report contains information from privacy practitioners that is meant to help companies decide which privacy product categories are the best fit for driving and scaling privacy compliance.

The subcategory level in the NIST Privacy Framework presents privacy controls or capabilities that organizations should consider adopting to address privacy risk. These subcategories, which can be aligned with organizations’ privacy programs, range from “data elements can be accessed for deletion” to “records of data disclosures are maintained and can be shared.” A map between the subcategories in the NIST Privacy Framework and the privacy product categories from the Privacy Tech Vendor Report, supplemented with data governance and technical and organizational measures, can be used by the privacy technology market and by organizations to align privacy measures or technical solutions with the privacy controls or capabilities while better addressing privacy risks. Additionally, mapping prioritized privacy product categories to each equally weighted subcategory in the NIST Privacy Framework can help organizations make better decisions on how to accomplish the needed privacy controls and capabilities and deal with privacy risks based on their risk appetite.
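To make the idea of such a map concrete, here is a minimal, hypothetical sketch in Python. The two subcategory strings are the framework examples quoted above; the product-category assignments are illustrative guesses, not an official NIST or IAPP mapping.

```python
# Hypothetical illustration only: the product-category assignments below are
# illustrative guesses, not an official NIST or IAPP mapping.
subcategory_to_products = {
    "Data elements can be accessed for deletion": [
        "Data Subject Access Request", "Data Mapping", "Data Discovery"],
    "Records of data disclosures are maintained and can be shared": [
        "Data Mapping", "Activity Monitoring", "Data Governance"],
}

def candidate_products(subcategory: str) -> list[str]:
    """Return the product categories that may help satisfy a given subcategory."""
    return subcategory_to_products.get(subcategory, [])

print(candidate_products("Data elements can be accessed for deletion"))
```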

Next, I briefly present the Core functions of the NIST Privacy Framework, including example categories for each, and general details on each privacy product category or essential privacy measure, based on a prioritized map between the NIST Privacy Framework and the privacy measures and privacy product categories.

According to the NIST Privacy Framework, the following five Core functions should be performed concurrently and continuously to form or enhance an operational culture that addresses the dynamic nature of privacy risk.

·         Identify – Understand the business context, including the privacy interests of individuals affected, and legal/regulatory requirements. Prioritize efforts, consistent with risk management strategy and business needs. Examples of categories include: Inventory and Mapping, Business Environment, Governance, and Risk Assessment.

·         Protect – Implement safeguards that enable authorized data processing to be conducted in a protected state. Examples of categories include: Identity Management, Authentication, Access Control, Awareness and Training, Data Security, and Protected Processing.

·         Control – Enable data management, by organizations and individuals, with sufficient granularity to manage privacy risks. Examples of categories include: Policies, Processes, and Procedures; and Data Management.

·         Inform – Enable organizations and individuals to have reliable information about how data are processed, in order to manage privacy risk effectively. Examples of categories include: Transparency Processes and Procedures, and Data Processing Awareness.

·         Respond – Implement appropriate activities to take action regarding a privacy breach or event.  Examples of categories include: Mitigation and Redress.


The following is a prioritized list of privacy product categories and essential privacy measures, with highlighted contributions to privacy controls and capabilities, aligned with the equally weighted subcategories in the NIST Privacy Framework.

·         Data Governance is, among other things, a privacy measure: the exercise of authority and control in the organization over the management of data assets, through planning, supervision, and control of data management and use, directed toward achieving organizational goals. Established data governance controls contribute to many subcategories in the NIST Privacy Framework and are key to reporting on data privacy risks, managing regulatory requirements, providing privacy awareness education to stakeholders, authorizing data processing, and responding to data breaches.

·         Technical and Organizational Measures (TOMs) help ensure a level of security appropriate to the privacy risk through the ongoing confidentiality, integrity, availability, and resilience of data processing and of the personal data, and through control of access to them. TOMs include authentication, authorization, accounting, network traffic control, vulnerability management, and data encryption, along with data minimization and retention and privacy-by-design solutions.

·         Assessment Management solutions help with privacy impact assessments, locating and managing risks, and demonstrating compliance. These solutions enhance visibility into the business context, regulatory requirements, and privacy risks, and support the prioritization of risk remediation efforts.

·         Data Mapping solutions allow building and managing asset inventories and mapping personal data flows. These solutions enhance visibility into data processing and therefore have a high positive impact on personal data protection and control.

·         Incident Response solutions help with managing data breach response processes through workflows and information on data breach notification laws.

·         Data Subject Access Request solutions help with receiving and managing individuals' requests to access, change, correct, and delete their personal data.

·         Consent Management solutions support collecting, tracking, demonstrating, and managing individuals’ consent while giving individuals control over their communication preferences. These solutions allow organizations to inform individuals about data processing and enhance individuals' control over their personal data.

·         Data Discovery solutions help determine and classify, along with its business context, what personal data the organization holds, in order to help manage privacy risk and compliance.

·         Privacy Information Managers help track information about data privacy regulations, laws, and guidelines on a global scale in an effective and efficient way, helping to serve individuals’ privacy interests and supporting data privacy compliance management.

·         De-Identification/Pseudonymity solutions help data scientists, researchers, and other stakeholders derive value from datasets without compromising the privacy of the individuals in them, either by generating distinct pairwise identifiers that carry no identifying information about an individual and discourage activity tracking and profiling beyond the operational requirements established by the organization, or by removing personally identifiable information from the datasets.

·         Activity Monitoring solutions help to manage risks to personal data based on detailed information on how it is used and who and what can access it.

·         Data Breach Notification services provide a complete communication solution for data breaches and support affected individuals in managing their risks.

·         Website Scanning solutions scan websites and report on cookies and other trackers, and help ensure compliance with cookie laws and regulations through tailored banners, preference centers, and cookie disclosures.

Thursday, May 12, 2016

Data Processors - Practices for achieving compliance with the General Data Protection Regulation

The new EU privacy regulation, the General Data Protection Regulation (GDPR), will come into force on 25 May 2018. The deadline appears far off, but given the number of changes that must be implemented, it is closer than it seems. Data processors, including cloud service providers (CSPs) whose services store and process EU citizens' personal data on a large scale, must be part of this new regime. Therefore, in business-to-business situations, CSPs acting as data processors must adapt the data privacy controls and practices they built around the old EU Data Protection Directive to the new requirements of the GDPR. This post details some practical actions that CSPs which process personal data can take in order to achieve compliance with the GDPR.

Thursday, January 30, 2014

Building Trust in Cloud Service Security


The use of cloud computing services has expanded rapidly in recent years. These services give organizations scalability, dynamic flexibility, and lower IT implementation and operating costs. At the same time, they create significant challenges for customers around the confidentiality, reliability, and availability of information. Customers are expected to place full trust in the security mechanisms of the service provider, which in most cases is solely responsible for managing the cloud computing infrastructure and information security.
Many customers, like the simple son of the Passover Haggadah, "do not know how to ask" the right questions about this complex subject.

There are functional questions, such as: Who is responsible for the data while it is stored in the cloud?
Who owns the metadata created by processing the data?
Which laws or regulations govern the data stored in the cloud service?

There are also contractual questions, such as: How does one avoid vendor lock-in with a cloud provider?
Can the information security controls of the provided service be reviewed by an independent body?

There are many other questions as well, among them: Who has access to the customer's data or to the encryption keys?
From where can administrative access to the service's computing infrastructure be obtained?
What controls does the service have in place to prevent data leakage to an unauthorized party or to another customer of the provider?
What reporting mechanisms exist for the processes the provider performs in the cloud service? Can reports on the provider's compliance with its service levels be obtained?

Cloud providers that want to improve potential customers' trust in the security level of their service must make sure that they have built-in security mechanisms and that they can provide answers to these and many other questions.

The international Cloud Security Alliance, which promotes information security in cloud services worldwide, has published lists of questions and guidance on cloud service security. These tools can be used to create and develop trust between customers and cloud providers in all matters relating to information security.

Closing the gaps against cloud information security requirements (see the links above) and communicating the relevant information to the service's potential customers are essential in order to turn everyone into "wise sons" and to increase the overall trust in the security of cloud services.




Friday, December 27, 2013

Information splitting in Cloud Storage Services

Introduction

The use of cloud computing services has been expanding rapidly in recent years, as it enables scalability, quick adaptation to dynamic changes in business requirements, and a reduction in total cost of ownership. However, these services create challenges regarding information confidentiality and availability, since the cloud service provider is solely responsible for managing the computing infrastructure and information security.

There are many ways to maintain information confidentiality, such as steganography (message hiding), privilege mechanisms, or sophisticated encryption techniques. Nevertheless, in all of these solutions the information is still out there, stored in a cloud service provider's repository in one form or another, available for a determined attacker with the appropriate resources to find and expose.

Beyond confidentiality, a major concern for organizations that store their data in cloud services is degraded information availability due to hardware, software, or communication failures. Although cloud services use multiple geographically dispersed data centers for enhanced service reliability, all of those data centers belong to the same administrative domain and are susceptible to the same risks. One can of course encrypt and duplicate the information across different cloud storage services (a typical data recovery procedure), but that approach increases the total cost of ownership, the number of attack vectors, and the overall information risk.

Encrypting information requires protecting the encryption keys, and here lies a dilemma: if these keys are stored in a cloud storage service, personnel within the service provider can gain unauthorized access to the data, which defeats the purpose of encryption in the first place. If the encryption keys remain on users' desktops, information availability will be lost in the event of a local failure.

Information splitting - Secret Sharing

So where should the encryption key, or the data itself, be stored? One appealing solution to these challenges is splitting the encrypted data and distributing it across multiple private or public cloud storage services. This approach can reduce the dependency on the availability or reliability of any single cloud storage service, or on the data security controls it applies.

One can randomly split the secret into several shares and store each share in a different cloud storage service, so that no single cloud service can recover the secret without the assistance of the others. In order to recover the secret, all or some of the different cloud storage services must collaborate and feed their shares into a restore function. That is, even if an attacker could obtain some random piece of information stored in one cloud storage service, they could not disclose the secret.

One simple and scalable way to implement data splitting is to XOR the secret number you want to protect with a random number, store the result in one place and the random number in another (and do not forget to delete the original secret). To restore the secret, XOR the two stored values, as in the sketch below.
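The following minimal Python sketch illustrates this two-way XOR split (the helper names are illustrative, not taken from any particular library):

```python
import secrets

def split_xor(secret: bytes) -> tuple[bytes, bytes]:
    """Split a secret into two shares; each share alone reveals nothing."""
    pad = secrets.token_bytes(len(secret))                 # the random number
    masked = bytes(a ^ b for a, b in zip(secret, pad))     # secret XOR random
    return pad, masked                                     # store each share in a different cloud

def restore_xor(pad: bytes, masked: bytes) -> bytes:
    """XOR the two stored shares together to recover the secret."""
    return bytes(a ^ b for a, b in zip(pad, masked))

key = b"example encryption key"
share_a, share_b = split_xor(key)
assert restore_xor(share_a, share_b) == key                # then delete the original secret
```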

In the previous example, if one share is lost or cannot be retrieved (as in the case of degraded availability of one cloud service), the information cannot be recovered at all. A more sophisticated method, which enhances data availability, enables recovery of the data from a combination of only a subset of the distributed shares. For example, one can split the data into four shares, of which any two are sufficient to restore the secret.

Professor Adi Shamir (one of the inventors of the RSA algorithm) proposed in 1979 a simple method for splitting secrets (secret sharing), based essentially on curves and the points that define them. Two points are sufficient to define a straight line, three points are sufficient to define a parabola, and so on. One can construct a polynomial of any degree with random coefficients (except for the secret one wishes to protect) and distribute points (x and y values) on its curve to every participant with whom one wishes to share the secret. Because each participant receives the values of only one point, no participant has any knowledge of the secret itself. The polynomial's degree determines how many shares must be combined in order to recover the original secret. For example, one can draw a straight line with a randomly chosen slope, with the secret being the line's intersection with the y-axis, and then distribute four different point values on the line to four different participants; any pair of the four can combine their point values to recover the secret.

In general, this algorithm is called a (k, n) threshold scheme. The secret is split into n random shares, each the size of the secret itself. Combining any k of the shares (where k is at most n) allows recovery of the secret, while combining fewer than k shares discloses no information about the secret whatsoever. Thus, the loss of any single share does not degrade the secret's availability. A minimal code sketch follows the figure below.
Figure 1 - secret splitting into shares using a parabola
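For illustration, here is a minimal Python sketch of a (k, n) threshold scheme over a prime field; it is a toy implementation of the idea described above, and a production system should use a vetted secret sharing library instead:

```python
import secrets

PRIME = 2**127 - 1  # a prime larger than any secret we expect to split

def make_shares(secret: int, k: int, n: int) -> list[tuple[int, int]]:
    """Pick a random polynomial of degree k-1 whose constant term is the secret,
    then hand out n points on its curve as the shares."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# The straight-line example from the text: 4 shares, any 2 recover the secret.
shares = make_shares(secret=123456789, k=2, n=4)
assert recover(shares[:2]) == recover(shares[2:]) == 123456789
```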

This method is considered "unconditionally secure" because it remains secure even if an attacker with unbounded resources exposes one of the shares (or any fewer than k of them). Disclosing the sensitive information would require breaking into at least k of the cloud storage services that keep the random secret shares.

Moreover, even if one cloud storage service is compromised and one secret share is disclosed, the method allows a new series of random secret shares to be created and distributed to the cloud storage services that were not compromised, without changing the secret itself, in a way that invalidates the exposed share and makes it useless to the attacker.

Efficient information splitting

Secret sharing improves the confidentiality and availability of the secret data. The main disadvantage of the secret sharing algorithm is its low efficiency: transferring and storing the secret shares requires bandwidth and storage space equal to the product of the secret's size and the number of shares. If one stores a large file of, say, 1 GB using 5 shares, then 5 GB of storage will be required. Therefore, this method, which enhances confidentiality, is mostly suitable for small secrets (e.g., encryption keys) and less suitable for large ones (such as files or databases).

There is also a method called "secret sharing made short" (SSMS), which uses a three-phase process: encrypting the information; splitting the data with an information dispersal algorithm (IDA, developed by Professor Michael Oser Rabin in 1989), which uses erasure coding to split the data in a very efficient manner; and splitting the encryption key itself with a secret sharing algorithm. In this solution each cloud storage service locally stores a share of the encrypted data and a share of the encryption key. In SSMS, the encryption and the secret sharing algorithm provide information confidentiality, while the IDA improves both information availability and confidentiality in a very efficient way. The downside of SSMS is that, unlike the pure secret sharing method, it is not immune to an attacker with unbounded resources. Nevertheless, in order to recover the secret an attacker would have to penetrate several cloud storage services and recover both the encrypted information and the encryption key, which is also split.
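To give a rough intuition for the dispersal step, here is a toy (2, 3) dispersal in Python, in which two data stripes plus an XOR parity stripe let any two of the three pieces rebuild the whole. This is only an illustrative stand-in; Rabin's actual IDA uses erasure coding to support arbitrary (k, n) layouts.

```python
def disperse(data: bytes) -> tuple[bytes, bytes, bytes]:
    """Toy (2, 3) dispersal: even bytes, odd bytes, and an XOR parity stripe."""
    if len(data) % 2:
        data += b"\x00"   # pad to an even length (a real IDA would also record the length)
    a, b = data[0::2], data[1::2]
    parity = bytes(x ^ y for x, y in zip(a, b))
    return a, b, parity

def rebuild(a=None, b=None, parity=None) -> bytes:
    """Rebuild the data from any two of the three stripes."""
    if a is None:
        a = bytes(x ^ y for x, y in zip(b, parity))
    if b is None:
        b = bytes(x ^ y for x, y in zip(a, parity))
    return bytes(byte for pair in zip(a, b) for byte in pair)

stripe_a, stripe_b, stripe_p = disperse(b"encrypted payload!")
assert rebuild(b=stripe_b, parity=stripe_p) == b"encrypted payload!"
```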

Another method that enables efficient secret sharing while preserving confidentiality is the All-or-Nothing Transform with Reed-Solomon (AONT-RS), which combines the AONT proposed by Professor Ronald Rivest (another inventor of the RSA algorithm) with erasure coding. The method first encrypts the information and packages it together with the encryption key into blocks in such a way that the information cannot be recovered without all of the blocks, and then uses an IDA to split the blocks into shares stored in multiple cloud storage services. A toy sketch of the all-or-nothing packaging step appears after the figure below.
Figure 2 - information dispersal to multiple cloud storage services
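To make the all-or-nothing idea concrete, here is a toy Python sketch (assuming the third-party cryptography package for AES-GCM). It follows the spirit of the construction by appending the key XORed with a hash of the ciphertext, but it is not the exact AONT-RS packaging and omits the Reed-Solomon dispersal step:

```python
import hashlib
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # third-party package

def aont_package(data: bytes) -> bytes:
    """Encrypt with a random key, then append key XOR hash(ciphertext):
    without every byte of the package, neither the key nor the data is recoverable."""
    key = AESGCM.generate_key(bit_length=256)
    nonce = secrets.token_bytes(12)
    ciphertext = nonce + AESGCM(key).encrypt(nonce, data, None)
    masked_key = bytes(a ^ b for a, b in zip(key, hashlib.sha256(ciphertext).digest()))
    return ciphertext + masked_key   # this package would then be split with erasure coding

def aont_unpackage(package: bytes) -> bytes:
    """Reverse the transform once all blocks of the package are gathered."""
    ciphertext, masked_key = package[:-32], package[-32:]
    key = bytes(a ^ b for a, b in zip(masked_key, hashlib.sha256(ciphertext).digest()))
    return AESGCM(key).decrypt(ciphertext[:12], ciphertext[12:], None)

assert aont_unpackage(aont_package(b"a file to be dispersed")) == b"a file to be dispersed"
```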

Summary

Secret splitting mechanisms (split key, split knowledge) were traditionally used to protect sensitive information within organizations or to secure access to encryption keys; examples include dual-control mechanisms and performing sensitive operations within an HSM. In recent years a number of registered patents and technological solutions have used the SSMS or AONT-RS algorithms, which are based on secret splitting concepts, as a basis for efficient information assurance in cloud storage services. These solutions are incorporated into desktop software or cloud storage gateways and enable secure storage of data across multiple private and public cloud storage services. Secure information dispersal algorithms can effectively and efficiently improve the overall security of many services, such as big data storage, cloud data centers, data archiving, data backup, and file synchronization.

This article was also published here.

Friday, November 22, 2013

Mobile Security Solution for U.S. DoD Mobile Devices

Fixmo announced it was selected to provide mobile data protection and cyber security solutions for the U.S. DISA as part of a contract for managing mobile devices within the U.S. DoD. Fixmo’s technology will be used to secure mobile Email and Browsing, and to protect the integrity and compliance of Apple and Android DISA-managed mobile devices.

Fixmo’s technology can enforce the use of complex passcodes and allow users to digitally sign encrypted emails and log into back-end systems using their Common Access Cards.

Fixmo has been actively involved with DISA and mobile device manufacturers since 2010 in producing mobile security technologies that align with the requirements of DISA security policies and STIG definitions.

Some of the core technology developed by Fixmo evolved from its participation in the National Security Agency’s technology transfer program. Some of the intellectual property was originally developed by the NSA for use within DoD, specifically to enforce the security of BlackBerrys. Since then, DoD has moved to expand its variety of mobile devices.

Through its partnership with DMI, Fixmo will provide the following technologies as part of the DISA MDM/MAS contract award:

  • Fixmo Secure Mobile Apps for Email, Calendaring, Contacts and Secure Browsing on smartphones and tablets.

  • Fixmo SafeZone Secure Workspace for application containerization, multi-factor user authentication, FIPS 140-2 AES 256-bit encryption of data-at-rest and data-in-transit, and on-device data leakage prevention (DLP).

  • Fixmo Enterprise Server for secure remote access to private DOD networks.

  • Fixmo Sentinel Integrity Services for device continuous integrity verification, compromise detection and policy compliance reporting.

The MDM system is in the test phase, and it is expected to reach initial operating capability in January 2014.

The DISA contract could cover as many as 300,000 DOD mobile users by 2016.

Sunday, June 30, 2013

Information Splitting in Cloud Services

The use of cloud computing services has expanded rapidly in recent years, creating significant challenges around information confidentiality and availability, since the service provider is solely responsible for managing the computing infrastructure and information security.
Encrypting data in cloud services produces encryption keys that must be managed, and here lies a dilemma: if these keys are stored in some cloud service, the service provider will be able to access the data, which defeats the purpose of encryption in the first place. If the encryption keys remain on the service users' workstations, the availability of the data will suffer in the event of a local failure.

So where should the encryption key, or the data itself, be kept? One of the appealing solutions to these challenges is splitting the encrypted data, including the encryption key itself, across several different cloud storage services. In this way one can reduce the dependency on the availability of a single cloud service or on the information security controls applied by the service provider.

In the articles I have published, and in an accompanying presentation, I present this new-old method for splitting information and its integration into cloud services. Combining the Information Dispersal Algorithm (IDA) of Professor Michael Oser Rabin (based on erasure coding methods for storing data efficiently) with the Secret Sharing algorithms of Professor Adi Shamir, or the All-or-Nothing Transform (AONT) of Professor Ron Rivest, has made it possible to produce patents and technology that improve the confidentiality and availability of data in cloud storage services. Below are links to two of the many companies working in this area (as examples only of the technological solution), together with general descriptions of their solutions.


Schematic 1 - technology from Cleversafe

Schematic 2 - technology from Cleversafe


Schematic 3 - technology from SecurityFirst






Friday, April 19, 2013

US DoD is adopting Mobile Technology - status and challenges

The US Department of Defense (DoD) recently released its Commercial Mobile Device (CMD) Implementation Plan, which will equip the DoD’s 600,000 mobile-device users with secure classified and protected unclassified mobile solutions.

This plan updates the DoD’s mobile strategy.

The following video presents some of the Mobile Technology applications (and security challenges) in the US Army:


The following is a very interesting DoD press briefing on the CMD Implementation Plan (25 February 2013):




The DoD is pursuing two separate working paths to accomplish the plan:

1. The Defense Information Systems Agency (DISA) released the Mobile Device Management (MDM) / Mobile Application Store (MAS) Request for Proposal (RFP) in October 2012. The MDM capability will function as a "traffic cop," enforcing policy for the network and user end devices.

2. DISA’s mobility pilot started in May 2012 and builds enterprise mobile capabilities. The participants partner with DISA for the pilot’s unclassified side, while teaming up with the NSA to address the classified side of mobility. The following table lists several component mobility pilots and their initial operational uses:


The goal - development of an enterprise mobile device management (MDM) capability and mobile application store (MAS) to support multivendor (Blackberry, iOS, Windows and Android), CAC-enabled, government-furnished devices by February 2014.

The scope - establishment of a separate, reliable, secure and flexible wireless infrastructure, for unclassified (DISA) and classified (NSA) devices and mobile applications.

The interesting news - a deployment plan of a new NSA security architecture that permits the use of commercial products for classified communications for the first time.


The Commercial Mobile Device Working Group (CMDWG) will review and approve standards, policies, and processes for the management of mobility solutions and mobile applications on an ad-hoc basis.

The (several) challenges:
  • The transfer from decentralized to centralized MDM services.
  • The optional usage of commercial devices, MDM / MAS solutions and accreditable cloud solutions.
  • Federated management and certification for mobile applications.
  • For the device security compliance process, DISA is using new Security Requirements Guides, a set of security standards that each device or application must comply with (instead of the STIG process, which is relatively long).
  • Continuous monitoring and enforcement of policy compliance for configuration of applications and OSs.
  • Secured authentication of mobile devices and users in unclassified networks.
  • Processing of classified information on commercial mobile infrastructure, devices and applications:  
    • Establishment of a separate MDM / MAS infrastructure for classified information.
    • Encrypting information using a minimum of two independent layers of Suite B commercial encryption. 
    • Deployment of CMD architectures and implementations using NSA approved standards. 
    • Protection of voice communications on carrier infrastructure and also using gateways for interoperability with the PSTN. 
    • Use of secured hardware tokens for trusted user identification and authentication to SIPRNet.