Author: Yan Qiang
In modern business development, data-driven innovation plays a vital role. The introduction of private data allows companies to identify potential customers more accurately, serve target groups better, and even open up new market spaces.
Of course, within every blessing lurks a hidden danger. Each innovation may break the usual norms for the use of private data, bringing additional privacy risks and adverse social impacts such as intrusion into personal privacy and leakage of sensitive corporate information. Government departments, as the guardians of market and social order, set corresponding laws and regulations to define the standards enterprises must follow when conducting business innovation that relies on private data.
In recent years, led by the European Union's General Data Protection Regulation (GDPR), governments around the world have continuously refined privacy protection legislation and greatly strengthened its enforcement. In China, the existing legal framework, represented by the "People's Republic of China Cyber Security Law", the "Information Security Technology Personal Information Security Specification", and the "Personal Financial Information Protection Technical Specification", also prescribes penalties for privacy violations, ranging from fines to imprisonment.
Just last Friday (March 6), personal privacy protection was refined further at the national level: the 2020 edition of the national standard "Information Security Technology Personal Information Security Specification" was officially released. It sets clear rules for collecting, storing, and using personal information, and stipulates that personal information subjects have the rights to query, correct, and delete their data, withdraw authorization, cancel accounts, obtain copies of their personal information, and so on. It also adds new requirements such as "autonomous choice among multiple business functions", "restrictions on the use of user profiling", "use of personalized display", and "third-party access management".
How can business innovation effectively meet the strict requirements of privacy compliance? Here, we share some thoughts on balancing privacy compliance risks with modern business development, from the perspective of controlling compliance costs: how to identify privacy compliance risks, understand compliance requirements at different levels, control compliance costs through technical means, and respond to new privacy compliance challenges that may arise as the business expands.
Clear goals for privacy compliance
Given the differences between corporate development stages and regional laws and regulations, the first task in effectively responding to privacy risks is to clarify the goals of privacy compliance.
We can observe that, as legislation has deepened in recent years, disputes over "what data counts as private data" have been decreasing. Although each region's laws and regulations define private data differently, they provide concrete type definitions and sensitivity classifications, placing, for example, KYC identity data and financial data at the most sensitive level. This allows us to avoid the previously unclear issue of rights boundaries and to clarify the goals of privacy compliance.
For businesses operating in a region, the goals of privacy compliance can be summarized as:
Protect customers' legal rights by protecting the privacy data defined in current regional market laws and regulations, and providing corresponding features in product design.
Two keyword phrases can be extracted here: "data content protection" and "data rights protection". They represent the two main lines of privacy compliance.
Next, we will describe the corresponding compliance needs around the nine dimensions of the two main lines.
Dismantling compliance requirements from nine dimensions
The vast majority of successful business models in an information society inevitably rely on massive amounts of private data originating from large numbers of customers. Traditional manual governance methods are very limited in efficiency, and the potential penalties for violations are considerable. Therefore, we need to introduce technical means to meet compliance needs across all dimensions.
The nine dimensions of compliance requirements are like the ninefold barrier of privacy compliance. For ordinary companies, focusing on the most basic dimensions can meet compliance needs. However, for companies in a highly regulated industry, such as fintech companies, or companies operating multinational information services, such as online social networks and cross-border e-commerce, they may need to meet compliance requirements in all dimensions.
How can we clear each barrier and finally achieve stable business development within a compliant legal framework? Below, we elaborate the relevant points one by one.
First dimension: interface data hiding
Hiding data in the user interface ensures that when customers use the product, their private data cannot be seen by malicious third parties nearby (for example, by shoulder surfing).
As one of the most easily satisfied requirements in data content protection compliance, it can be met with direct interface rendering operations such as simple display masking and data truncation.
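As a minimal illustrative sketch, display masking can keep only the head and tail of a sensitive field; the `mask_middle` helper below is hypothetical, not taken from any particular product:

```python
def mask_middle(value: str, keep_head: int, keep_tail: int, pad: str = "*") -> str:
    """Mask the middle of a sensitive string, keeping only head/tail characters."""
    if len(value) <= keep_head + keep_tail:
        # Too short to mask safely: hide everything.
        return pad * len(value)
    hidden = len(value) - keep_head - keep_tail
    return value[:keep_head] + pad * hidden + value[-keep_tail:]

# Example: mask a phone-number-style string for on-screen display.
print(mask_middle("13812345678", 3, 4))  # 138****5678
```

Note the short-input branch: without it, masking a value shorter than the kept head and tail would leak the whole string, an instance of the "no hiding effect" pitfall discussed below.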
However, precisely because it seems trivial, it is often neglected and becomes a frequent source of privacy incidents. In particular, when multiple sensitive data fields are displayed at the same time, improperly applied masking may be equivalent to no masking at all.
Second dimension: network data hiding
Hiding data at the network level makes it impossible for malicious parties to intercept plaintext while private data is in transit.
The classic transport layer security protocols (TLS/SSL) can meet this demand. However, note that the security of such protocols depends on the normal operation of trusted public-key certificate services. Once those services are attacked, certificate forgery or unexpected certificate expiration may follow, ultimately affecting the security and availability of existing services.
Third dimension: in-domain data hiding
Private data can only be decrypted into clear text in a secure isolated computing environment. Outside the secure isolated computing environment, only ciphertext operations can be performed and stored in media in the form of ciphertext. Trusted hardware or software isolation is used here to build a secure isolated computing environment. They rely on different security assumptions and need to be selected based on the characteristics of the business.
Fourth dimension: cross-domain computing data hiding
The plaintext of private data appears only within a single computing domain. When performing joint computations with other computing domains, the controllers of those domains must be unable to directly access, or indirectly infer, the plaintext, preventing partners from obtaining sensitive private data beyond what the cooperation agreement authorizes.
As the most challenging requirement in data content protection compliance, it is particularly important for highly sensitive data businesses such as medical and financial data. Failure to meet it usually means the business is unavailable or faces huge fines. Moreover, penalties may be two-way: a company can be penalized not only for leaking private data through its own program vulnerabilities, but also for exploiting a partner's program vulnerabilities to obtain sensitive private data without authorization.
To avoid related privacy compliance incidents, common technical solutions include data desensitization, secure multi-party computation, outsourced computation over encrypted data, zero-knowledge proofs, and so on. In specific scenarios involving machine learning, emerging technologies such as federated learning can provide more effective solutions.
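As a toy sketch of the idea behind secure multi-party computation, additive secret sharing lets parties jointly compute a sum while no individual share reveals the plaintext. This is illustrative only, not a production protocol, and all names are invented:

```python
import secrets

P = 2**61 - 1  # a public prime modulus; all arithmetic is done mod P

def share(value: int, n: int) -> list[int]:
    """Split `value` into n additive shares; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

# Two parties jointly compute a sum without exposing their plaintext inputs:
# each distributes shares, share-wise sums are computed, then reconstructed.
a_shares = share(120, 3)
b_shares = share(45, 3)
summed = [(x + y) % P for x, y in zip(a_shares, b_shares)]
print(reconstruct(summed))  # 165
```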
Fifth dimension: data access notice
Data access notice refers to informing customers of the details of the private-data life cycle, such as what private data the current business collects, why it is needed, how it will be used, how it will be stored, and how long it will be kept. As the most basic requirement in data rights protection compliance, it guarantees the customer's right to know.
The difficulty in meeting this demand is making obscure technical language and the consequences of the related privacy risks understandable to customers, so that regulators cannot cite customers' confusion as grounds for finding a violation.
Research on user experience and human-computer interaction is the key to handling this demand. Appropriate use of machine-learning-based automatic risk matching, a relatively well-recognized technique in the industry in recent years, can reduce customers' cost of understanding and help them more rationally assess the potential risks of the corresponding business.
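One way to keep a notice both machine-readable and customer-readable is to describe each collected field in a structured record and render a plain-language summary from it. The field names and wording below are assumptions for illustration only:

```python
import json

# A machine-readable notice entry per data field: what, why, how, how long.
notice = {
    "field": "phone_number",
    "purpose": "account login and transaction alerts",
    "storage": "encrypted at rest",
    "retention_days": 365,
    "shared_with_third_parties": False,
}

# Render the entry as one plain-language sentence for the consent screen.
summary = (
    f"We collect your {notice['field'].replace('_', ' ')} for "
    f"{notice['purpose']}; it is kept {notice['storage']} for up to "
    f"{notice['retention_days']} days and is "
    f"{'' if notice['shared_with_third_parties'] else 'not '}"
    f"shared with third parties."
)
print(summary)
print(json.dumps(notice, indent=2))  # the structured form, for audit tooling
```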
Sixth dimension: data collection control
Data collection control refers to allowing customers to choose which private data the business system may collect, and to adjust those collection options later. Because collection is the starting point of the private-data life cycle, this requirement gives customers global control over the circulation of their own private data: data that customers are unwilling to share cannot enter the business system without authorization, giving customers a sense of security. Traditional access control technology can meet this requirement well, but if the original system architecture was not designed for extensibility, retrofitting legacy systems will be a huge engineering challenge.
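A minimal sketch of such collection control is a consent whitelist that filters fields before they ever enter the business system; the class and field names are hypothetical:

```python
class CollectionConsent:
    """Track which data fields a customer has authorized for collection."""

    def __init__(self):
        self._granted: set = set()

    def grant(self, field: str) -> None:
        self._granted.add(field)

    def revoke(self, field: str) -> None:
        self._granted.discard(field)

    def collect(self, record: dict) -> dict:
        # Unauthorized fields are dropped at the entry point and never
        # reach downstream storage or processing.
        return {k: v for k, v in record.items() if k in self._granted}

consent = CollectionConsent()
consent.grant("email")
consent.grant("phone")
consent.revoke("phone")  # the customer adjusts the choice later
print(consent.collect({"email": "a@b.com", "phone": "138", "location": "x"}))
# {'email': 'a@b.com'}
```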
Seventh dimension: data usage control
Data usage control refers to allowing customers to adjust or restrict the use of private data in specific business systems.
Originally one of the GDPR's distinctive compliance requirements, known as the right to restriction of processing. The latest version of the "Information Security Technology Personal Information Security Specification" also contains related provisions, which apply only to certain types of business. At present it mainly targets personalized recommendation services related to online advertising; the original intention is to prevent overly personalized recommendations from intruding strongly into personal privacy.
Eighth dimension: derived data control
Derived data control means that customers are allowed a certain degree of control over the derived data generated from their original private data through transformation and aggregation.
This is also one of the GDPR's distinctive compliance requirements, and it currently manifests in two main areas:
- Right to be forgotten: after a customer deletes the account, the corresponding individual historical data, and aggregated data containing the customer's data, are cleaned up;
- Right to data portability: a customer who intends to leave the current business platform can extract all relevant historical data, such as e-mail, comments, and cloud host data.
Meeting this demand also usually incurs a high cost of system transformation. Companies are advised to consider a comprehensive private-data traceability mechanism early in system architecture design, to reduce the compliance cost of later retrofits.
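One possible shape for such a traceability mechanism: tag every derived artifact with the customers whose data contributed to it, so that account deletion can cascade. This is a deliberate simplification (a real system might recompute aggregates rather than delete them), and all names are hypothetical:

```python
from collections import defaultdict

class LineageStore:
    """Record which customers' data feeds each derived artifact,
    so that deleting an account can cascade to derived data."""

    def __init__(self):
        self.derived = {}                # artifact_id -> derived value
        self.sources = defaultdict(set)  # artifact_id -> contributing customers

    def add_derived(self, artifact_id, value, customer_ids):
        self.derived[artifact_id] = value
        self.sources[artifact_id] |= set(customer_ids)

    def forget(self, customer_id):
        # Right to be forgotten: drop every artifact the customer fed into.
        doomed = [a for a, ids in self.sources.items() if customer_id in ids]
        for artifact_id in doomed:
            self.derived.pop(artifact_id, None)
            self.sources.pop(artifact_id, None)
        return doomed

store = LineageStore()
store.add_derived("avg_spend_q1", 87.5, ["u1", "u2"])
store.add_derived("u3_profile", {"tier": "gold"}, ["u3"])
removed = store.forget("u1")
print(removed)  # ['avg_spend_q1']
```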
Ninth dimension: data impact review
Data impact review refers to allowing customers to review business decisions based on their private data, thereby correcting unfair judgments that automated decision-making systems may make and eliminating negative effects such as data discrimination.
This is probably the most challenging requirement in data rights protection compliance. Its focus is on the interpretability of data-driven decision systems, restricting the application of hard-to-explain machine learning models in key areas such as people's livelihood and medical care. Companies therefore need to adopt decision models with strong interpretability, or provide alternative technical solutions, when designing automated decision systems, to reduce the compliance costs caused by misjudgments.
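As a sketch of a highly interpretable decision model, a linear score with explicit per-feature contributions lets every decision be explained, and therefore reviewed, by the customer. The weights, features, and threshold below are invented for illustration:

```python
# A transparent linear scoring model: every decision decomposes into
# named per-feature contributions, supporting customer review of outcomes.
WEIGHTS = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}
THRESHOLD = 1.0

def decide(features: dict):
    """Return (approved, per-feature contributions) for one applicant."""
    contributions = {k: WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS}
    score = sum(contributions.values())
    return score >= THRESHOLD, contributions

approved, why = decide({"income": 3.0, "debt": 1.0, "years_employed": 2.0})
print(approved)  # True: 1.5 - 0.8 + 0.6 = 1.3 >= 1.0
print(why)       # the explanation shown to the reviewing customer
```

A rejected customer can see exactly which contribution drove the outcome, which is precisely what opaque models make difficult.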
In a word: legislation is strong and regulation strict; thriving technology lets companies and compliance meet!
Increasingly detailed privacy protection laws and regulations impose ever more quantitative compliance requirements on massive amounts of private data. Only with the help of technology can privacy compliance be achieved effectively, clearing the obstacles to continuous business innovation and preparing enterprises to open up international markets.
Therefore, starting from the next post in this series, we will begin with cryptography, the core technology area of privacy protection, and gradually share in-depth analyses of the key technologies with you. Stay tuned.
About the author:
Yan Qiang, PhD in Information Security from SMU, winner of a Best Paper Award at a top international information security conference. He was once the only early core member from China in Google's privacy protection infrastructure technology department, where he led the development of technical solutions that were fully integrated and put into production across major portal products in the Android and Google Play ecosystems.