Opening Remarks by Commissioner, Mr Lew Chuen Hong, at International Association of Privacy Professionals (IAPP) Asia Privacy Forum 2023, on 19 July 2023

19 Jul 2023

1. A very good morning. It is a real pleasure to be here at the IAPP Asia Privacy Forum 2023.

2. I would like to extend a very warm welcome to all our distinguished guests, and to my counterparts from ASEAN, the rest of the Asia Pacific, and the Middle East.

Growth of digital economy and importance of data

3. 2023 marks the third year since the onset of the COVID-19 pandemic. We see travel restrictions easing, and most economic and social activities have resumed, restoring a sense of normalcy for many.

4. As global economies shape their paths to recovery, one of the lasting legacies of COVID-19 has been the greater adoption of digital technologies, which is now at the core of what many companies are thinking and doing. Indeed, the digital economy continues to power ahead and remains a bright spot today. South East Asia’s digital economy in 2022 was estimated at US$200 billion – twice its size just 3 years ago. Data volumes and data flows have increased correspondingly. In fact, some estimates indicate that 90% of the world’s data today has been generated in the last 2 years!

5. One analogy often used is that data is the new oil – I actually don’t agree with that. Oil gets consumed and disappears; I think data is more like capital. The more data is shared and re-used (it doesn’t disappear), the more value it generates. For the economists amongst us, you will know that in financial terms there is the idea of the “velocity of capital”. In my mind, I am very clear that the velocity of data drives the digital economy, and that it underpins the efficient functioning of global supply chains – from the placement of orders, to payment details, to the transport of goods across distribution networks, to the last-mile delivery to our doorsteps.

6. I think the role of all of us here today is not just that of data protection professionals, but of professionals who are supporting the growth of the digital economy. While there are clear opportunities accompanying the greater use and sharing of data for consumers, businesses and regulators alike, we are also increasingly wary of the risks and the potential for abuse by bad actors – all of which undermine trust. I want to place emphasis on the word “trust”.

a. Consumers need a trusted ecosystem, with assurance that the new services and technologies they interact with are secure, and that their data will be used responsibly.

b. Businesses have a huge role to play as key stakeholders in building and maintaining trust – not because regulation demands it, but because trust is core to enabling the use of data for innovation, so as to keep pace with changing consumer expectations.

c. The role of regulators is to provide guardrails and a positive environment to build that trust. Only then can we address the risks while facilitating innovation.

What are some of the concerns that have appeared on the horizon? There are three broad areas of concern involving data use.

7. First, Artificial Intelligence – more specifically, generative AI. Some say Gen AI is actually a 70-year overnight success. AI as a topic has been around since the era of Alan Turing; it is only recently – with the mass availability of data and compute, and with new model architectures – that what we see today as ChatGPT and Gen AI has literally exploded onto the scene in the past few months. There is huge potential, from everyday end-user applications all the way to deep professional domains. At the same time, there is a deep visceral sense that developments in tech are running ahead very quickly and potentially out of our grasp. Technology at the forefront is in fact leapfrogging the ability of much regulation to deal with the consequences.

a. Recent incidents underscore the risks of Gen AI – from the citation of non-existent court cases in legal research, in what is termed “hallucination”, to the creation of deepfakes used in scams and to spread disinformation. Regulators around the world are also grappling with the use of huge volumes of data for the training of AI.

i. Earlier this year, the Italian data protection authority imposed a temporary ban on ChatGPT. Reasons included the lack of a legal basis under the EU GDPR for the massive collection and use of personal data, the lack of age restrictions, and the potential for ChatGPT to provide factually incorrect information in its responses.

ii. The EU is looking to introduce a new AI Act, while Japan and the UK have developed, or are in the midst of developing, guidance on key issues of concern instead of “hard regulations”.

8. Singapore has chosen to take a balanced approach to meet the twin goals of data protection and risk mitigation, while allowing market innovation to take place.

a. We first launched the Model AI Governance Framework in 2019, setting out key issues and guiding principles such as transparency, explainability and fairness to promote trust in AI. Beyond these broad principles, we wanted to give the community more concrete tools to measure some of these principles algorithmically.

b. Based on these internationally-aligned principles, we developed the AI Verify Minimum Viable Product (MVP) last year. I would like to stress that it is an MVP because it is still at a very nascent stage and far from the forefront of where AI development is. We thought it was a good starting framework to promote transparency on how companies are using their AI models, thereby giving that information to consumers. The toolkit tests for fairness, explainability and robustness, and we have open-sourced the code under the AI Verify Foundation so that the world can contribute to building on AI Verify as a tool.

c. We also recognise that Gen AI is a completely different ball game. As a starting point to address the risks posed by Gen AI, we published a discussion paper that lays out how we think about risks and policy considerations, governance and testing frameworks, and the need for a practical, trust-based approach. This includes, for example, (i) defining accountability between model developers and the businesses building services on top of these models; (ii) developing evaluation metrics for Gen AI models; and (iii) clarifying the use of data for model training.

d. In relation to data use for the training of Gen AI, the PDPC is also studying these issues and considering whether further guidance should be provided under the PDPA, recognising that Gen AI raises unique concerns, such as the wholesale “memorisation” and “regurgitation” of the personal data used to train it.

e. Before we address the issues brought on by Gen AI, the PDPC will be providing clarity on how the core principles of the PDPA apply to the use of data in traditional AI systems. Yesterday, the Minister for Communications and Information, Mrs Josephine Teo, announced that we will be launching Advisory Guidelines on the use of Personal Data in AI Recommendation and Decision Systems for open consultation. We look forward to hearing from industry on the draft.

9. The second issue is the landscape for facilitating responsible cross-border data flows.

a. We now have 137 countries with data protection laws, a marked increase over the last decade. Data protection laws across the world are similar in most underlying data protection principles, yet each differs in its own way.

b. The data transfer landscape is understandably diverse and at times fragmented. While most data protection laws put in place requirements to ensure that citizens’ personal data is adequately protected, there is a lack of simple, globally accepted transfer standards to date. This complex regulatory landscape has a disproportionate impact, especially on SMEs seeking to expand globally.

10. Within ASEAN, there have been efforts to harmonise data protection standards and facilitate data flows through the ASEAN Model Contractual Clauses for intra-ASEAN data transfers.

a. Building on ASEAN’s collective efforts, we have also worked with the EU to promote mutual understanding of EU and ASEAN contractual clauses for data transfers.

b. We also believe in the value of supporting G2G standards and solutions for such transfers, such as the Global Cross Border Privacy Rules (or Global CBPR). The Global CBPR first started out in APEC, with the same set of standards now being opened up globally. The Global CBPR brings us a step closer to having a common network, and I am heartened to see many like-minded countries joining the Global CBPR Forum.

11. The third issue I would like to spotlight relates to why we do what we do, which is ultimately the impact on citizens and every single individual on the ground, especially the most vulnerable.

a. Businesses and digital platforms today have the ability to collect vast amounts of data generated through consumers’ interactions and to impact the experiences of all our citizens, especially the more vulnerable around us. Children are more impressionable and highly susceptible to external influences. This makes them the group most vulnerable to personalised content that is age-inappropriate or harmful. How can we build a safe and trusted digital space for children, the next generation growing up with these platform technologies?

b. Last year, Singapore passed the Online Safety (Miscellaneous Amendments) Act to protect Singaporeans, especially the young, from harmful content online. The Code of Practice for Online Safety, which requires designated social media services to take measures to enhance online safety on their platforms, took effect on 18 July 2023. The PDPC will now be consulting the public on how businesses can implement appropriate safeguards to ensure that children’s data is collected, used and protected adequately. We have also launched an Innovation Call to explore the viability of privacy-preserving age estimation solutions that complement existing age assurance methods – such as facial analysis to determine an individual’s age.

Conclusion

12. I share these thoughts on what I believe will be the pressing issues we need to get right at the fundamental core – as far as AI and cross-border flows are concerned – if we do believe that the velocity of data drives the future of the digital economy. At the same time, we must not forget the impact of what we do on the day-to-day lives of our citizens, especially the vulnerable amongst us.

13. Singapore is not alone in confronting these challenges that have come to the fore. We are happy to collaborate with industry, practitioners and data protection authorities to build trust within the ecosystem while enabling innovation in the overall digital economy.

14. With that, thank you for your kind attention, and I wish you a most fruitful time at the IAPP Asia Privacy Forum.
