Data as a resource – Meld. St. 22 (2020–2021) Report to the Storting (white paper)

7 Fair, ethical and responsible use of data

The Government will facilitate a responsible data economy in Norway and work to ensure that data are used in fair, ethical and responsible ways. Security and privacy must be safeguarded. Consumers’ rights shall be protected and there should be fair competition rules for Norwegian and international businesses. While better use of data can lead to services being personalised and tailored to individuals, the work on preventing discrimination, manipulation and misuse of information will be particularly important. Transparency, equal treatment and legal certainty are important democratic principles and must also apply in the data economy.

7.1 Fair competition in the data economy

Many of the biggest companies in the data economy depend first and foremost on all the information users and customers leave behind when visiting websites, searching for information, downloading and using mobile apps or shopping online. People’s everyday use of technology generates large amounts of personal data, and enterprises with business models based on refining, exchanging and using such information are dependent on consumers’ trust.

7.1.1 The dominance of the big technology companies

The world’s biggest technology companies, measured by market value and market shares, are leading the way in technology development and the data-driven economy. Five big technology companies currently dominate the global market: Alphabet (Google’s parent company), Facebook, Apple, Microsoft and Amazon. Large Chinese technology companies have also entered the global market, such as the internet company Alibaba, the social network TikTok, and Huawei, which delivers technology and equipment for telecommunications, mobile phones, etc. These businesses are fierce competitors in areas such as smartphones, operating systems, analytics technology and cloud computing services. Other areas are dominated by individual companies, such as Facebook in social media.

The competition from the big international platform companies primarily affects the service industries, such as banking, finance, insurance and the media, but no industry is immune. The market dominance of the big technology companies challenges the ground rules and exposes Norwegian business and industry to fierce competition. Competition can be positive and lead to a high pace of innovation and low prices, but that is contingent on fair competition on terms that are as equal as possible. Network effects and tax issues are factors that challenge this.

The prohibition provisions in the Competition Act are worded in such a way as to render them applicable to new and emerging technologies, industries and markets. Nonetheless, there is reason to question whether the specific competition law issues that can arise in a data-driven economy can be adequately resolved with today’s legislation. This is part of the reason why the European Commission is currently formulating competition rules that can effectively address new technology and new markets. The allocation letter to the Norwegian Competition Authority for 2021 specifies that the authority must continually assess whether it has the necessary enforcement tools to deal with the digital economy. In this connection, the Norwegian Competition Authority will follow up regulatory development processes in both the EU and the OECD.

Network effects

Some markets are characterised by economies of scale and what are known as network effects. A network effect arises when the benefit derived by one user of a product or service depends on how many other users use the same product or service. Telephony is a good example: who would be interested in buying the world’s only telephone? The value of telephony depends on how many others have a telephone. In markets with network effects, demand will often ‘take off’ once a given number of users is reached, becoming self-reinforcing and generating a large number of users.
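This dynamic can be made concrete with a stylised calculation. The sketch below is illustrative only, not part of the white paper’s analysis; it assumes, as in Metcalfe’s law, that the value of a network grows with the number of possible connections between users:

```python
# Illustrative sketch (assumption: network value grows with the number
# of possible user-to-user connections, as in Metcalfe's law).

def network_value(users: int, value_per_connection: float = 1.0) -> float:
    """Total network value, proportional to n * (n - 1) / 2 connections."""
    return value_per_connection * users * (users - 1) / 2

# Doubling the user base roughly quadruples total value, which is why
# demand becomes self-reinforcing once a critical mass is reached.
for n in (10, 100, 1_000):
    print(f"{n:>5} users -> total value {network_value(n):>12,.0f}")
```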

Social media are another example. It is no fun being on a social medium where no one else is. In 2009, ‘Nettby’ was Norway’s most popular online community. Nettby was closed down in December 2010 because all its users had moved to other online communities, predominantly Facebook. This illustrates another aspect of network effects: if the cost to the user of switching to alternative services is low, ‘all’ the customers may switch when a new and seemingly more attractive service emerges. Providers will therefore try to create mechanisms that make it less attractive for their users to switch services, for example by making the transaction costs of switching prohibitively high.

Figure 7.1 Use of social media

Photo: Robin Worrall on Unsplash

Moreover, new businesses find it difficult to establish themselves in a market in which one or more dominant actors are already operating. A typical characteristic of digital services is that once a service is fully developed and established, the cost of serving one additional customer (the marginal cost) is extremely low, practically zero. This means that the service can be scaled up at a very low cost. The challenge for new entrants in markets like these is to build up a critical mass of services and users. Because establishment costs are high, the cost of each new customer during the establishment phase is extremely high.

The competition authorities serve to counteract network effects, attempting to restrict monopolies and thus ensure competition in the market. A relevant example is the requirements that were imposed by US authorities, and subsequently by the European Commission, on Microsoft to publish APIs for the Windows platform so that desktop programmes from other providers could open and process files such as Word documents. The Norwegian Consumer Council later took the lead in forcing Apple to remove its copy protection system (digital rights management (DRM) software), which at the time meant that music purchased from Apple could only be played on Apple devices.

Apple and Google – through the App Store and Google Play Store – have been criticised for exploiting their market dominance by making it difficult for competitors and by giving preferential treatment to their own products. For example, these companies have a payment model that requires other companies to pay a considerable share of their revenues to Apple and Google for distributing their apps on the platforms. This generates higher distribution costs for Apple’s and Google’s competitors, as there are few alternatives for reaching consumers.

7.1.2 The Digital Services Act and the Digital Markets Act

In recent years the European Commission has been monitoring the digital platform economy and platform companies such as Facebook, Amazon and Google. In December 2020 the European Commission presented its proposals for the Digital Services Act and the Digital Markets Act. These legislative initiatives have two main goals: to create a safer digital space in which the fundamental rights of all users of digital services are protected, and to establish a level playing field to foster innovation, growth and competitiveness, both in the EU/EEA and globally.

The Digital Services Act (DSA) can be viewed as a revision of the e-Commerce Directive. Since the e-Commerce Directive was adopted in 2000, the digital landscape has changed significantly with the emergence of new digital platforms. The DSA is intended to make it easier to remove illegal content online and to better protect users’ fundamental rights, including freedom of expression. The DSA should also provide better public supervision of digital platforms, particularly platforms reaching more than 10 per cent of EU citizens.

More specifically, this means:

  • the possibility to flag and trace sellers of illegal goods, services or content online

  • better protection for users

  • algorithmic transparency, for example in services that provide recommendations

  • greater responsibility for the dominant platforms to prevent misuse of their services through risk management and internal control

  • access for researchers to key data from the very large platforms so that they can scrutinise how online risks evolve

  • an oversight structure to address the complexity of the online space

The aim of the Digital Markets Act (DMA) is to regulate and limit the market power of certain large platforms that control the online market to a large degree. In practice this means the companies collectively known as GAFAM: Google, Amazon, Facebook, Apple and Microsoft. The initiative entails prospective regulation (known as ex ante regulation) of large platforms, meaning that obligations and requirements are imposed on these platforms without prior evidence of negative effects on competition. The European Commission is also expected to propose a new ‘competition toolbox’, that is, new tools that can be used on a case-by-case basis where competition problems arise. This may also include assessments to identify market failures in digital markets.

Norway supports the EU initiative to develop a new regulatory framework for digital platforms. In its response to the EU consultation on the DSA and the DMA, Norway stresses the importance of ensuring that the new framework does not impose an excessive regulatory burden that disproportionately affects European small and medium-sized enterprises. Norway also considers it important that necessary security and crime-prevention concerns be taken into account, and that no obligations be imposed on digital platforms that may result in content from editorial media being censored in ways that negatively impact freedom of expression. Furthermore, Norway recommends stronger protection of consumer rights on digital platforms and stresses the need for universal design.

7.1.3 Taxation of multinational companies

Another aspect of importance to the competition situation is taxation. If multinational companies can achieve lower tax rates than domestic companies by exploiting weaknesses in national and international regulations, this will have a negative impact on the competitive situation for Norwegian companies. The combination of globalisation and digitalisation has enabled business models that challenge national and international tax regulations, making it possible for multinational companies to reduce their tax liabilities through tax planning and to shift taxable profits between countries.

Between 2013 and 2015 the OECD and the G20 countries conducted a joint project called Base Erosion and Profit Shifting (BEPS). It was launched to address weaknesses in national and international tax regimes that create opportunities for multinational enterprises to, among other things, shift taxable profits from group companies in countries with high tax rates to group companies in countries with low or no tax. Over time, such profit shifting can make it difficult to retain tax revenues and can have negative competition effects for domestic businesses. This type of activity is associated with many industries and is not exclusively linked to digitalisation. Through the BEPS project, consensus was reached on a number of initiatives to counteract profit shifting and tax base erosion.

Digitalisation of the economy is exacerbating challenges in the taxation of multinational enterprises and is creating new challenges beyond profit shifting alone.1 Under current international tax rules, physical presence is a key criterion for taxation, but digitalisation and new business models have created opportunities for multinational enterprises to have a market presence without a physical presence (properties or employees). This puts pressure on international rules regulating which countries can impose tax on these enterprises’ revenues.

Work is currently being conducted in the OECD/G20 Inclusive Framework, a collaborative body comprising more than 140 countries, including Norway, to find a collective, consensus-based solution for a more effective and fair tax system in an increasingly digitalised economy. The Inclusive Framework is working on a two-pillar approach:2

Pillar 1: Allocation of taxing rights to business profits between countries

This pillar will establish new rules on where tax should be paid (‘nexus’ rules) and a fundamentally new way of sharing taxing rights between countries. The aim is to ensure that digitally intensive or consumer-facing multinational enterprises pay taxes where they conduct sustained and significant business, even when they do not have a physical presence.

Pillar 2: Profit shifting and tax base erosion

This is a continuation of the BEPS project. The aim is to introduce a global minimum tax that would help countries around the world address remaining issues linked to Base Erosion and Profit Shifting by large multinational enterprises. Another aim is to counteract harmful tax competition between countries and a ‘race to the bottom’ on corporation tax.

Some European countries have argued that the EU ought to put in place a common European proposal for fairer taxation of digital services. In a joint statement on 14 October 2020, the G20 finance ministers expressed their continued support for the Inclusive Framework process and called for continued efforts to reach a global, consensus-based solution. The Government will keep the Storting informed of developments in, and the outcome of, the negotiation process and will present proposals for any amendments to laws and international agreements that are necessary to implement the initiatives.

7.2 Fair, ethical and responsible use of data

Use of data can help make services more personalised and better adapted to individuals. Many will find this a positive development. Nonetheless, the data economy has some negative aspects in the form of echo chambers, manipulation of information and misuse of personal data. The Government wants to promote a culture where business and industry see the value in developing ethically sound solutions, products and services.

7.2.1 Risk of manipulation, echo chambers and discrimination

Data analytics can reveal whether people are impulsive or cautious, whether they like to be the first to buy new products or whether they react best to hearing that an item is almost sold out.3 Knowledge of individuals’ personality traits makes it possible to adapt advertising aimed at individuals or consumer groups in ways that make it seem relevant and useful, but advertisers can also exploit consumers’ vulnerability by personalising advertising at points in time when individuals are most likely to be receptive. Such marketing not only challenges privacy and consumer protection; it raises broader ethical questions about the individual’s right to self-determination.

The risk of manipulation based on a combination of data analytics and behavioural psychology is not confined to advertising; analysis of personal data has also been used to manipulate individuals into adopting a specific political view. Cambridge Analytica’s use of personal data collected from Facebook in various election campaigns is an example of this. Without users’ consent or knowledge, analyses and profiles were developed to target individuals with political messages. This case shows that information obtained for one purpose can be exploited in new and unpredictable ways. The use of personal data to influence democratic elections blurs the line between privacy and consumer protection and fundamental civil rights, and can pose a threat to democracy.4

Business models that are based on personalised digital services can also impact the conditions for public discourse. Digital echo chambers are created when individuals are exposed to a disproportionate share of views that reinforce their own. This has created favourable conditions for phenomena such as ‘fake news’ and ‘alternative facts’. Over time, this can undermine trust in the mainstream media. Echo chambers can also stifle open discourse, which is a key prerequisite for a free and democratic society.

The risk of discrimination and misguided conclusions

The use of profiling based on algorithms and artificial intelligence can produce better, more personalised services, but it can also increase the risk of unlawful differential treatment. When services are more personalised, consumers risk being subjected to price discrimination. Algorithmic advertising systems can also be designed so that, for example, advertisements for housing rentals and job vacancies are only shown to certain groups of people. This means that others are essentially excluded and potentially subjected to discrimination.

Enterprises that process personal data are obliged to ensure that the information is correct. This proves particularly challenging when analyses are based on data obtained from multiple sources. A flawed factual basis can in many contexts have negative consequences for individuals. Any errors in the datasets included in an analysis can lead to incorrect decisions. While the data may be correct seen in isolation, they can produce unfair and discriminatory results if, for example, an algorithm contains historical biases.
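One simple check of the kind an assessment of algorithmic discrimination might include is to compare outcome rates across groups. The sketch below is a minimal, hypothetical illustration; the data are invented, and the four-fifths threshold is a common heuristic rather than a requirement drawn from this white paper:

```python
# Illustrative sketch: checking a decision system's outcomes for signs of
# indirect discrimination by comparing selection rates across groups.

from collections import Counter

# Hypothetical (outcome, group) pairs produced by an automated system.
decisions = [("approved", "A"), ("approved", "A"), ("rejected", "A"),
             ("approved", "B"), ("rejected", "B"), ("rejected", "B")]

totals = Counter(group for _, group in decisions)
approved = Counter(group for outcome, group in decisions if outcome == "approved")

rates = {group: approved[group] / totals[group] for group in totals}
ratio = min(rates.values()) / max(rates.values())  # disparate impact ratio

print(f"selection rates: {rates}")
print(f"disparate impact ratio: {ratio:.2f}"
      + ("  <- below the 0.8 heuristic, warrants review" if ratio < 0.8 else ""))
```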

7.2.2 Ethics by design and ethical risk assessment

Requirements for risk assessments and privacy by design are laid down in the General Data Protection Regulation (GDPR). Privacy by design is a key requirement in the GDPR and means that consideration must be given to privacy in all phases of development of a system or solution. When sharing and using data for new purposes, the same consideration should be given to ethics by design and ethical risk assessments. Artificial intelligence is a central component of the data economy. For example, when using artificial intelligence, it is important to assess whether an algorithm could lead to discrimination and then implement measures to reduce that risk. Ethical assessments may also cover potential consequences for the environment and whether a system contributes to achieving the UN Sustainable Development Goals.

In the National strategy for artificial intelligence (2020), the Government promotes seven principles for responsible development of artificial intelligence:

  1. Solutions based on artificial intelligence must respect human autonomy and control.

  2. Systems based on artificial intelligence must be safe and technically robust.

  3. Artificial intelligence must take privacy into account.

  4. Decisions made by systems built on artificial intelligence must be traceable, explainable and transparent.

  5. The systems must facilitate inclusion, diversity and equal treatment.

  6. Artificial intelligence must be developed with consideration for society and the environment, and must have no adverse effects on institutions, democracy or society at large.

  7. Mechanisms must be introduced to ensure accountability for solutions based on artificial intelligence and for their outcomes, both before and after solutions are implemented.

7.3 Privacy challenges in the data economy

Personal data represent an important resource, and if used correctly can benefit both individuals and society. At the same time, privacy is a human right that is protected by the European Convention on Human Rights and the Constitution of the Kingdom of Norway. It is important that the use of personal data takes place within the confines of the law and of what is ethically defensible. Another prerequisite is to create trust so that individuals are willing to share information about themselves.

Overlap between data protection and consumer protection in the data economy

There is considerable overlap between data protection and consumer protection in the data economy. Data protection is challenged by data-driven business models by which consumers are offered products and services in return for disclosing personal data. Consumers are often unaware of how this information is used. Without this knowledge, consumers are not fully able to influence how their personal data are used or to take on the role of critical consumers in the data economy.

It is possible to protect consumers’ interests and still allow the private sector to create value from new technologies and data. The Government sees a need to strengthen the position of consumers and strike a better balance between businesses and consumers in the digital economy.

In 2019 the EU adopted a directive aimed at modernising consumer rights in light of the digital transformation: the Better Enforcement and Modernisation Directive.5 The new directive imposes more stringent information requirements on providers of digital services, search engines and online marketplaces. The directive strengthens consumer rights in agreements where the consumer pays for services in the form of personal data, and provides for stricter sanctions. A violation that affects consumers in multiple member states may result in fines with a maximum of at least four per cent of the trader’s annual turnover in the member states concerned. The Ministry of Children and Families is currently working on incorporating the directive into Norwegian law.

The EU has also adopted a directive with the purpose of harmonising the rules on delivering digital services to consumers.6 The directive applies not only where the method of payment is money, but also where the consumer provides personal data, unless the personal data are processed exclusively to supply the digital service or to meet legal requirements. In December 2020 the Ministry of Justice and Public Security circulated draft legislation implementing the directive for consultation.

The Government will present a national strategy for a safe digital childhood. The aim of the strategy is to develop a cohesive policy for children’s digital lives. The strategy will discuss the positive aspects and the risks of children’s use of online resources. Important topics will be consumer rights, data protection, cyber security, online marketing, digital safety for children, digital skills and e-commerce. The Norwegian Media Authority will be responsible for the practical coordination of the work.

7.3.1 Complicated end user agreements and privacy statements

In the data economy, personal data are often shared and used for secondary purposes. Such processing can challenge the principle of purpose limitation. The principle states that the purpose of processing personal data must be clearly stated and established when the data are collected. This is fundamental to ensuring that individuals have control of their data and can give informed consent to data processing. In accordance with the principle, personal data may not be used for new purposes that are incompatible with the initial purpose unless this is based on consent or a statutory provision.7,8

Consent from individuals to the processing of their personal data can allow the development of innovative products and services and facilitate more personalised digital solutions. Consent must be freely given, specific and informed, and must be expressed through a clear affirmative act.9 These requirements are intended to ensure that individuals receive good information about how their personal data will be processed and can exercise control over their use. In practice, it can be difficult to understand the terms and conditions for the use of apps and digital services, making it almost impossible for individuals to have real control and influence over what personal data are collected and how they are used. According to the GDPR, the privacy statement must be clearly separated from other terms of use, and the information must be given in language that is clear, plain and easy to understand. In practice, however, these statements are often long, complicated, detailed and full of legal and technical terms that are extremely difficult for individuals to interpret.

The companies that collect data will often have a far more detailed understanding of the data collection, and of how the data are used, than the individual consumers. This phenomenon is known as information asymmetry. Enterprises that use advanced methods of data collection and profiling can achieve a competitive advantage. Consumers’ lack of understanding means that they often have no real possibility to assess the privacy implications of products and services and to reject those that use invasive methods. Such information asymmetry can hamper the development of balanced solutions and prevent privacy-friendly solutions from becoming a competitive advantage.

Trading of personal data in the digital advertising market

Trading of personal data in the digital advertising market, combined with new technology that enables analysis of large datasets, has changed the way in which advertisers reach consumers. Personalised marketing has far more impact and accuracy than conventional marketing techniques. Although personalised advertising is often advantageous for consumers, the information can also be used to manipulate individuals into buying products they otherwise would not have bought. Data are used to analyse and identify people’s vulnerabilities and personality traits.

In January 2020 the Norwegian Consumer Council presented a report entitled Out of control. It describes how large amounts of personal data collected via various apps are resold. A digital twin is created of individual consumers, making it possible to follow them across services and platforms. This information gives commercial interests a detailed picture of individual consumers, such as information about their activities, preferences, purchases and health. These digital twins are sold on digital advertising exchanges, where advertisers submit bids indicating how much they are willing to pay to have their advertisement shown to consumers with a certain profile. The type of profile considered attractive depends on what is being sold. The resale of personal data is often regulated in long and complicated terms of use. To give informed consent, consumers will often have to read and understand the terms of use not only of the party they enter into an agreement with, but also of third-party service providers. This makes it almost impossible for regular consumers to know who receives the data and how they are used.
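The matching of profiles to bids can be illustrated with a drastically simplified sketch. Real-time bidding systems are far more complex than this; the advertiser names, targeting criteria and amounts below are entirely hypothetical:

```python
# Simplified sketch of how an advertising exchange matches a consumer
# profile ("digital twin") to advertiser bids. All values are invented.

bids = [
    {"advertiser": "TravelCo", "target": {"interest": "travel"}, "bid_nok": 0.40},
    {"advertiser": "LoanCo",   "target": {"interest": "loans"},  "bid_nok": 0.90},
    {"advertiser": "ShoeShop", "target": {"interest": "travel"}, "bid_nok": 0.25},
]

profile = {"interest": "travel"}  # the consumer's inferred profile

# Only advertisers whose targeting matches the profile compete for the impression.
eligible = [b for b in bids
            if all(profile.get(k) == v for k, v in b["target"].items())]
eligible.sort(key=lambda b: b["bid_nok"], reverse=True)

if eligible:
    winner = eligible[0]
    # In a second-price auction the winner pays the runner-up's bid.
    price = eligible[1]["bid_nok"] if len(eligible) > 1 else winner["bid_nok"]
    print(f"{winner['advertiser']} wins the impression at NOK {price:.2f}")
```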

7.3.2 Individual control over personal data

A fundamental privacy principle is the idea that individuals should have as much control as possible over their personal data. Because real control requires individuals to have knowledge of how their personal data are and will be used, entities that process personal data are obliged to provide this information in an intelligible and accessible form.

The right to data portability

One way in which individuals can exercise control over their own data is by requesting to receive them from the service provider. This is known as the right to data portability. This right applies when an individual, typically a consumer, has provided data based on consent or an agreement. Data portability means that individuals can obtain information about themselves in a commonly used and machine-readable format. The intention is that the user should be able to reuse the data across different systems and services.
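In practice, an export under the right to data portability might look like the following minimal sketch, in which a hypothetical user record is serialised to JSON, a commonly used, machine-readable format:

```python
# Minimal sketch of data portability: exporting the data a user has
# provided in a machine-readable format (JSON). Fields are hypothetical.

import json

def export_user_data(user_record: dict) -> str:
    """Serialise a user's data so it can be reused across systems and services."""
    return json.dumps(user_record, indent=2, ensure_ascii=False)

record = {
    "name": "Kari Nordmann",
    "email": "kari@example.com",
    "playlists": ["running", "focus"],
    "consents": {"marketing": False, "analytics": True},
}
print(export_user_data(record))
```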

This right strengthens not only privacy, but also consumer power. When consumers can transfer their personal data to a provider offering the best terms, this may encourage competition in providing privacy-friendly and secure solutions.

The European Commission has conducted an evaluation of the GDPR.10 The evaluation shows that the potential of the right to data portability is still not fully realised. One reason for this is the lack of standards enabling the provision of data in a machine-readable format. The Commission will explore practical means to facilitate increased use of the right to portability. Tools for managing user consent, standardised formats and interfaces may contribute to resolving this challenge.

Access strengthens individuals’ control over their personal data

The Government will consider establishing a solution where citizens can have access to their personal data in multiple public-sector systems via a common login system. The possibility of integrating selected enterprises from the private sector into such a system will also be considered. Such a system would make it easier for individuals to obtain a meaningful overview of how their personal data are used. Ideally, the solution should also allow individuals to manage how the data are used, including giving or refusing consent to sharing them.

7.3.3 Enhancing knowledge by providing guidance on the data protection rules

The GDPR is an important instrument for promoting data sharing in a secure and responsible manner and for creating trust. Reports indicate that companies spend considerable resources on understanding and interpreting the provisions in the GDPR.11 The perception of the regulation as complicated, and the fear of making mistakes and breaching the rules, may therefore create unnecessary barriers for companies wishing to try out new ideas through data-driven innovation.

The Government sees a need to enhance knowledge about the data protection rules in the public and private sectors and in the population in general. Sound knowledge of the rules is a prerequisite for ensuring that those who process personal data fulfil their obligations and that individuals can exercise their rights.

The Norwegian Data Protection Authority and the Consumer Authority provide guidance to individuals, businesses and public bodies. For example, these authorities answer specific questions, publish guidance material and give lectures. In 2020 the Norwegian Data Protection Authority and the Consumer Authority published a guide entitled Digital services and consumer data, targeting developers, marketers and digital service providers.

It is important that the scope for action in the GDPR is used in ways that do not create unnecessary barriers to innovation. A sound understanding of the rules is a prerequisite for creating innovative and balanced solutions that safeguard privacy. The Government wants to contribute to the development of these types of solutions. Consequently, an important measure in the National strategy for artificial intelligence was to establish a regulatory sandbox for data protection and artificial intelligence. The sandbox was launched in 2020.

The overarching objective of the regulatory sandbox is to stimulate innovation in ethical and responsible artificial intelligence. The Norwegian Data Protection Authority will provide free, professional guidance to selected projects under the sandbox framework. The projects included in the sandbox must be innovative, which means there will often be uncertainty about how the data protection rules are to be applied. Together with the participating organisations, the Norwegian Data Protection Authority will identify challenging regulatory issues and work towards sound, well-balanced solutions that safeguard privacy. The regulatory sandbox will fulfil several objectives:

  • The participating organisations will gain a better understanding of the regulatory requirements and thereby reduce the time from development and testing to rollout of artificial intelligence solutions in the market.

  • Solutions that are rolled out after being developed in the sandbox can set examples for other organisations seeking to develop similar solutions.

  • The Norwegian Data Protection Authority will gain a better understanding of new technological solutions and be able to identify potential risks and problems more easily at an early stage. This will enable the timely production of relevant guidance to clarify how the rules should be applied.

  • The Norwegian Data Protection Authority and the industries can identify sectors with a need for industry standards.

  • Individuals and society will benefit from new and innovative solutions being developed within responsible parameters.

7.3.4 Use of anonymous, de-identified and synthetic data

It is possible to leverage the potential of personal data without compromising privacy by using anonymised or de-identified data. Statistics Norway’s microdata.no platform is an example of this.

Use of anonymous data offers numerous possibilities for developing innovative products and services. Anonymous data are not personal data, because the information cannot be used to identify individuals. Anonymisation entails rendering it impossible to re-establish the link between the data and the specific individual, taking account of all the means reasonably likely to be used to identify the individual concerned. Because such information does not relate to identifiable individuals, the data protection rules do not apply, and anonymous data can be shared and used for new purposes.

Clarifying whether a dataset contains personal data, and the anonymisation process itself, can prove resource-intensive. That said, the benefits to be derived from anonymisation can more than outweigh the costs.

If anonymisation proves unsuitable or too demanding, de-identification may be a better option. De-identification entails the removal of all uniquely identifiable characteristics from the data. This can be useful and necessary for safeguarding personal data protection, but does not mean that the data are rendered anonymous. Use of de-identified data must therefore always comply with the personal data protection rules. Pseudonymisation is a form of de-identification where directly identifiable parameters (such as names) are replaced with pseudonyms, such as serial numbers. It must be impossible for such information to be attributed to a specific person without the use of additional information.
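A minimal sketch of what pseudonymisation can look like in practice is shown below. The records and field names are hypothetical; the essential points are that direct identifiers are replaced with serial numbers and that the lookup table linking pseudonyms to identities is stored separately and protected:

```python
# Illustrative sketch of pseudonymisation: direct identifiers are replaced
# with serial numbers; the lookup table is kept separately and protected.

import itertools

records = [
    {"name": "Ola Nordmann",  "diagnosis": "J06.9"},
    {"name": "Kari Nordmann", "diagnosis": "E11.9"},
]

serial = itertools.count(1)
lookup = {}          # pseudonym -> identity; must be stored separately
pseudonymised = []

for rec in records:
    pid = f"P{next(serial):04d}"
    lookup[pid] = rec["name"]
    pseudonymised.append({"id": pid, "diagnosis": rec["diagnosis"]})

print(pseudonymised)  # no direct identifiers remain in the working dataset
# Note: these data are NOT anonymous; with the lookup table they can be
# re-attributed to individuals, so the GDPR still applies in full.
```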

Figure 7.2 Anonymous data

Photo: Chris Yang on Unsplash

Risk of re-identification

Re-identification occurs when individuals are identified based on de-identified or apparently anonymous data, often as a result of integrating data from multiple sources. Extensive access to data, combined with better and cheaper analytics technology, has increased the risk of such re-identification. The risk can be reduced by allowing only anonymous data to be used in analyses. However, it is not always easy to assess whether a dataset has been anonymised or only de-identified. It can also be difficult to assess whether connecting it with other datasets that are currently available or that will be available in future may lead to re-identification.
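The sketch below illustrates, with entirely fictitious data, how re-identification by linkage can happen: two datasets that each look harmless are joined on shared quasi-identifiers (here postcode and birth year), re-attaching an identity to a ‘de-identified’ record:

```python
# Illustrative linkage attack: joining two datasets on quasi-identifiers
# re-identifies a person in the "anonymous" dataset. Data are fictitious.

deidentified = [  # e.g. a published research extract, names removed
    {"postcode": "0150", "birth_year": 1980, "diagnosis": "F32"},
    {"postcode": "5003", "birth_year": 1975, "diagnosis": "I10"},
]

public = [        # e.g. an openly available membership list
    {"name": "Ola Nordmann", "postcode": "0150", "birth_year": 1980},
]

for person in public:
    for row in deidentified:
        if (row["postcode"], row["birth_year"]) == (person["postcode"], person["birth_year"]):
            print(f"Re-identified: {person['name']} -> diagnosis {row['diagnosis']}")
```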

In order to safeguard trust and security, it is important to minimise the risk of re-identification. Enterprises must undertake thorough risk assessments when anonymising personal data and when integrating them with other datasets. Should the data prove to be identifiable, they must be processed in compliance with the provisions in the GDPR.

Textbox 7.1 Smartphone location tracking

Some data types, such as location data, are more difficult to anonymise than others. A person’s movements are often so unique that they can prove highly revealing. In the spring of 2020, the Norwegian Broadcasting Corporation (NRK) published a number of articles about the buying and selling of location data collected from free apps. For NOK 35,000, NRK bought from the British company Tamoco what were apparently anonymised location data on more than 140,000 unique smartphones and tablets belonging to Norwegian citizens. All the coordinates were linked to a date, a time and a specific device, and showed the exact location of a given device at a specific point in time. Using simple methods, NRK managed to identify several individuals and track their movements over time. The location data revealed where they lived and worked as well as information on stays in hospitals and crisis centres. Among the individuals NRK managed to identify were a member of parliament and key figures in the Norwegian Armed Forces.

Source: Furuly, Trude et al. (2020): Avslørt av mobilen [Exposed by the mobile]. Published on nrk.no on 9 May 2020

Synthetic data

An alternative to using de-identified or anonymised data is to use synthetic data. A synthetic dataset has the same statistical properties as a real-world dataset, but because the data do not pertain to real-life people, they do not involve personal data. Synthetic data have many applications and are often used as test data in system development projects where the alternative would be to test systems using real-world data. Since they contain no personal data, such datasets can also be made publicly available for purposes such as research and development.
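The simplest way to see the idea is a sketch like the one below, which draws fictitious records from chosen distributions. This is only illustrative; production-grade generators, such as the machine-learning approach used for the synthetic National Registry described in Textbox 7.2, are considerably more sophisticated:

```python
# Minimal sketch of synthetic test data: records are drawn from simple
# distributions so they resemble real data without describing any real
# person. All names and parameters below are invented for illustration.

import random

random.seed(42)  # reproducible test data

FIRST_NAMES = ["Ola", "Kari", "Nils", "Ingrid"]
LAST_NAMES = ["Nordmann", "Hansen", "Berg"]

def synthetic_person() -> dict:
    return {
        "name": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
        "birth_year": random.randint(1940, 2005),
        "income_nok": round(random.gauss(550_000, 120_000), -3),
    }

test_data = [synthetic_person() for _ in range(3)]
for person in test_data:
    print(person)
```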

Textbox 7.2 Generation of synthetic test data for the National Registry

The Norwegian Tax Administration has established a solution in which machine learning is used to generate rich synthetic test data in a dedicated test environment for the National Registry. The synthetic National Registry offers synthetic test subjects and simulates events. The objective is to allow enterprises that use information from the National Registry to test their integrations without using authentic personal data in the tests. These organisations include companies developing software for the public sector.

The synthetic data are available to anyone wishing to test integration with the National Registry or that need National Registry data for test purposes.

Source: The Norwegian Tax Administration

7.3.5 Privacy as a competitive advantage

The Norwegian Data Protection Authority regularly undertakes a large-scale survey on privacy trends in Norway. The privacy survey for 2019–2020 reveals scepticism among respondents towards data-driven business models. Around half of the respondents say they feel uncertain about whether smart home technology safeguards privacy. Three out of four respondents are negative towards the use of personal data to personalise advertisements.12

Extensive commercial use of personal data could lead to a loss of trust in businesses and growing reluctance to buy digital products and services. The Norwegian Data Protection Authority’s survey shows that more than 50 per cent of respondents have declined to use a service because they were unsure of how personal data were handled.

On the other hand, trust in businesses is enhanced if consumers feel confident that privacy and information security are safeguarded, and that the data are processed within responsible parameters. Trust and privacy can thus represent a competitive advantage for companies that can show that they process data in a lawful, responsible and ethical manner.

Code of conduct for processing personal data

One of the challenges with the GDPR is the discretionary nature of its provisions, while many organisations, particularly small and medium-sized enterprises, need concrete and industry-specific guidelines. Codes of conduct (also known as industry standards) are intended to remedy this and help organisations find clear answers to practical questions. The GDPR leaves it up to the industries themselves to develop codes of conduct for processing personal data and to have these approved, either by the Norwegian Data Protection Authority in the case of national standards, or at EU level. The codes do not need to be exhaustive; they can be thematically delimited.

The Government finds it encouraging that industries are preparing codes of conduct for processing personal data. If companies within a sector agree to use the same privacy standards, this can contribute to reducing distortions of competition. At the same time, the establishment and use of codes of conduct will lead to greater transparency and stronger privacy protection in the industries concerned. Businesses that can show that they comply with a code of conduct approved by the Norwegian Data Protection Authority will also reassure consumers seeking products and services that safeguard privacy.

Data protection certification mechanism

Pursuant to the GDPR, products and services can be certified according to specific criteria and be issued with a data protection seal or trustmark. A data protection seal would make it easier for consumers to assess whether privacy is safeguarded and whether the provider is trustworthy. This strengthens consumer power and can help make data protection a competitive advantage. Certification would be carried out either nationally or within the EEA, the latter of which may result in a common European data protection seal. The Norwegian Data Protection Authority is working on establishing a certification mechanism for Norway.

7.3.6 Enforcement and cooperation on data protection across Europe

The GDPR seeks to harmonise laws across the EEA. Citizens in all the member states are assured the same strong data protection, while businesses must comply with the same rules and administrative systems. This means there is limited scope to establish separate data protection regulations for Norway. The most important work in following up the regulation is therefore to ensure effective enforcement and interpretation of the rules.

Cooperation within the European Data Protection Board will be critical to achieving a common European understanding of the GDPR. One of the board’s main tasks is to provide guidance on how the GDPR should be interpreted and applied. Participation in the work on common European data protection issues offers possibilities to influence how the rules are interpreted. The Norwegian Data Protection Authority has therefore decided to actively contribute to the board’s work and has assumed leading roles in a number of large projects.

The GDPR lets the data protection authorities in the EEA member states impose administrative fines when the rules are infringed. Fines of up to EUR 20 million can be imposed or, in the case of companies, up to four per cent of their worldwide annual turnover from the preceding financial year, whichever is higher. The Norwegian Data Protection Authority also has alternative enforcement mechanisms, such as the authority to prohibit the unlawful processing of personal data and to impose coercive fines for non-compliance with orders. These mechanisms have strong potential to ensure regulatory compliance.
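As a worked example of how this ceiling operates (the turnover figure is hypothetical):

```python
# Worked example of the GDPR fine ceiling: for a company, the maximum is
# the higher of EUR 20 million and 4 % of worldwide annual turnover.

def max_gdpr_fine_eur(worldwide_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * worldwide_turnover_eur)

turnover = 2_000_000_000  # hypothetical: EUR 2 billion annual turnover
print(f"Fine ceiling: EUR {max_gdpr_fine_eur(turnover):,.0f}")  # EUR 80,000,000
```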

Cases of cross-border protection of personal data

Cross-border cases are those in which an organisation’s processing of personal data affects individuals in more than one EEA member state, or in which the processing is carried out by an organisation established in multiple EEA member states. Since data protection challenges in the data economy exist in an international context, it is particularly important that infringements of the rules in such cases be sanctioned.

The processing of cross-border cases is led by the data protection authority in the member state in which the company has its main establishment. This is known as the one-stop-shop mechanism. The mechanism is important for ensuring a harmonised interpretation of the rules, and it is practical because organisations and citizens only need to deal with a single data protection authority inside the EEA.

A number of cross-border cases have been decided through this collaborative mechanism since the GDPR entered into force. However, the European Commission’s evaluation of the GDPR shows that enforcement in such cases has its challenges.13 Processing takes time due to multiple data protection authorities being involved, and because of the often complicated and comprehensive nature of the cases.

Several of the large international IT companies, such as Facebook, Google and Twitter, have their head offices in Ireland. Ireland’s Data Protection Commission is therefore currently the lead authority for several important cross-border cases, and its decisions may significantly influence how the rules are interpreted. Despite the Irish Data Protection Commission being one of the supervisory authorities with the largest staff increases after the GDPR was introduced, there are problems with heavy workloads and long processing times.

The European Commission’s evaluation also shows that the supervisory authorities have not yet made full use of the cooperation mechanisms provided by the GDPR, such as joint operations and enforcement actions. The Commission reports that it will continue to foster more efficient and harmonised handling of cross-border cases.

7.4 Cyber security

Good digital security is fundamental to the digital economy and the data economy. Society is becoming increasingly vulnerable to cyberthreats, and the more data collected, stored and processed, the greater the exposure to vulnerabilities.

In January 2019 the Government published the National Cyber Security Strategy for Norway and a national strategy for cyber security competence (Nasjonal strategi for digital sikkerhetskompetanse). Norwegian companies’ ability to digitalise in a secure and trustworthy manner and be able to protect themselves against cyber security incidents are important strategic goals.

7.4.1 An increasingly complex risk situation

Data that are managed by public and private enterprises represent immense value. At the same time, the digital value chain and the composition of the organisations involved are complex, and often include both domestic and foreign entities. It is impossible to have oversight of every potential security challenge, so sound competence in cyber security is needed to ensure that new developments can be monitored and that any measures implemented are adapted to the risk situation.

According to the Norwegian National Security Authority (NSM), state-sponsored intelligence activities and criminal actors constitute the greatest cyberthreats to Norway. NSM is observing a steady stream of cyberattacks on Norwegian targets, including on organisations that perform critical societal functions.14

7.4.2 Information security and data sharing

Information security includes ensuring the confidentiality, integrity and availability of information that is processed and exchanged digitally.

Some data will need to be protected and kept secret. Datasets containing such information cannot be made openly available, and must only be shared subject to agreement and when both parties are sure that they have a legal basis to exchange and use the information. Transmissions of data must be appropriately secured, and the recipient must safeguard confidentiality. Encryption may be an appropriate measure to ensure this.
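As a minimal sketch of the confidentiality point, the following uses symmetric encryption from the third-party Python package cryptography. How the key is exchanged securely between the parties is a separate problem and is not shown; the payload is fictitious:

```python
# Sketch: protecting confidentiality in transit with symmetric encryption
# (Fernet, from the third-party "cryptography" package).

from cryptography.fernet import Fernet

key = Fernet.generate_key()      # must be shared securely with the recipient
cipher = Fernet(key)

payload = b'{"national_id": "01018012345", "diagnosis": "J06.9"}'  # fictitious
token = cipher.encrypt(payload)  # ciphertext is safe to send over an untrusted channel

assert cipher.decrypt(token) == payload  # the recipient recovers the original data
```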

Integrity assurance is always important, not least when sharing data. Procedures and systems must be established for data exchange to ensure that the information received is the same as the information transmitted, and that no changes – either deliberate or due to error – occurred during transmission. When no adequate procedures and systems are in place to ensure integrity during transmission, this must be clearly communicated to the recipient.
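One common way to verify integrity is for the sender to publish a cryptographic checksum alongside the data, as in this minimal sketch (the dataset is fictitious):

```python
# Sketch: verifying that received data are byte-for-byte identical to
# what was sent, using a SHA-256 checksum transmitted separately.

import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

sent = b"station;date;temperature\nBlindern;2020-01-01;2.3\n"  # fictitious dataset
published_digest = checksum(sent)  # shared with the recipient out of band

received = sent  # what actually arrives at the recipient
if checksum(received) == published_digest:
    print("Integrity verified: data unchanged in transmission")
else:
    print("Integrity check failed: data altered or corrupted")
```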

Availability has to do with having access to data when they are needed. If an organisation needs to use data in order to deliver a service or perform a task, the type of information must be identified, as well as any need for protecting confidentiality, integrity and availability.

Considerations of confidentiality, integrity and availability must often be weighed against each other. For example, high-level confidentiality may render data less readily available, in which case an assessment must be made of what is more important in specific situations.

7.5 Enforcement and supervision

The Government wants to have an effective system for supervision that is adapted to societal challenges and to needs in data protection, competition policy and consumer protection.

The Norwegian Data Protection Authority’s budget has been significantly increased in recent years. This is because the authority was assigned new tasks when the GDPR was incorporated into Norwegian law in the summer of 2018. The budget increase is particularly intended to make the authority well equipped to participate in EU cooperation on the protection of personal data.

The Government’s objective is to strengthen supervision of, and guidance on, consumer protection rules in the digital economy. The budget allocations for the Consumer Authority were therefore increased in 2019 and 2020. The increases are also intended to strengthen the Consumer Authority’s international activities, including the coordination of cross-border supervisory activities.

Competition in a digital economy will be one of the Norwegian Competition Authority’s focus areas moving forward.15 The Norwegian Competition Authority will make it easier for digitalisation to contribute to increased competition and thereby to efficient use of public resources. The Norwegian Competition Authority will also assess the possibilities created by digitalisation to detect competition law offences and improve the efficiency of investigative methods.

Cooperation between supervisory authorities in Norway

The data economy raises issues that cut across various sectoral legislation, particularly data protection, consumer protection and competition legislation. It is therefore important that the relevant supervisory authorities cooperate and exchange knowledge and information, and participate in relevant national and international fora. The Norwegian Data Protection Authority and the Consumer Authority have established good cooperation on consumer and data protection issues in recent years. It will be important to reinforce this cooperation moving forward.

In the European context, the European Data Protection Supervisor has established the Digital Clearinghouse initiative, where supervisory authorities for data protection, consumer protection and competition discuss how different regulatory regimes can be viewed in relation to each other to ensure the functioning of the digital economy. In a white paper to the Storting on consumer policy, the Government announced that it will create a similar cooperation forum at national level: Digital Clearing House Norway.16 The Consumer Authority has been charged with establishing the forum. The purpose is, among other things, to achieve more effective enforcement, avoid duplication of efforts and ensure a cohesive approach.

7.6 The Government will

The Government will:

  • encourage the creation and adoption of mechanisms for data protection certification

  • encourage development and use of codes of conduct (industry standards) for data protection

  • consider creating regulatory sandboxes in areas that are relevant for development of the data economy and data-driven innovation

  • evaluate the possibility of establishing a digital solution comprising multiple data controllers, where citizens can have access to, and possibly administer the use of, their personal data, including giving consent to sharing

  • encourage public and private enterprises to develop solutions that simplify individuals’ access to information on and control over how their personal data are processed and, where applicable, shared

  • enhance knowledge about data protection rules among consumers and businesses

  • establish a national cooperation forum to strengthen supervision of digital activities, modelled on the EU’s Digital Clearinghouse

Footnotes

1.

OECD (2015): Addressing the Tax Challenges of the Digital Economy, Action 1 – 2015 Final Report, OECD/G20 Base Erosion and Profit Shifting Project, OECD Publishing, Paris

2.

Updated information from the project can be found at www.oecd.org/tax/beps/

3.

Datatilsynet (2015): The great data race. Report on how commercial use of personal data is challenging privacy. November 2015

4.

Forbrukerrådet (2020): Out of Control. How consumers are exploited by the online advertising industry – and what we are doing to make it stop

5.

Directive (EU) 2019/2161

6.

Directive (EU) 2019/770. The directive was negotiated as part of a ‘package’ together with the new Directive (EU) 2019/771 on the sale of goods to consumers

7.

General Data Protection Regulation, Article 6(4) lists factors that must be taken into account when ascertaining whether or not the new purpose is compatible. Compatible reuse is generally permitted

8.

General Data Protection Regulation, Article 6(4); cf. Article 23(1) and Prop. 56 LS (2017–2018). If the basis for processing is laid down by law, the use must be a necessary and proportionate measure in a democratic society to safeguard specific important interests

9.

General Data Protection Regulation, Articles 4 and 7. Consent must be given more explicitly in some cases, such as when it applies to special categories of personal data. The regulation provides specific rules governing children’s consent in relation to digital services

10.

Communication from the Commission to the European Parliament and the Council. Data protection as a pillar of citizens’ empowerment and the EU’s approach to the digital transition – two years of application of the General Data Protection Regulation. COM/2020/264 final

11.

Communication from the Commission to the European Parliament and the Council. Data protection as a pillar of citizens’ empowerment and the EU’s approach to the digital transition – two years of application of the General Data Protection Regulation. COM/2020/264 final

12.

Datatilsynet (2020): Personvernundersøkelsen 2019/2020 [The Privacy Survey]

13.

Communication from the Commission to the European Parliament and the Council. Data protection as a pillar of citizens’ empowerment and the EU’s approach to the digital transition – two years of application of the General Data Protection Regulation. COM/2020/264 final

14.

NSM (2020): Helhetlig digitalt risikobilde 2020 [Overall digital risk situation].

15.

Nærings- og fiskeridepartementet (2020): Konkurransetilsynet (KT) – Tildelingsbrev [Norwegian Competition Authority – Allocation letter]

16.

Meld. St. 25 (2018–2019) Framtidas forbrukar – grøn, smart og digital [The consumer of the future – green, smart and digital]
