The New Economy Series: Cyber Security (LPL1-V18)

Description

This event recording explores the relationship between cyber security and economic development, the trade-offs faced by Canadian policy-makers, and the role Canada plays in protecting critical public and private assets.

Duration: 01:15:38
Published: February 22, 2022
Type: Video

Event: The New Economy Series: Cyber Security



Transcript

Transcript: The New Economy Series: Cyber Security

[The animated white Canada School of Public Service logo appears on a purple background. Its pages turn, opening it like a book. A maple leaf appears in the middle of the book that also resembles a flag with curvy lines beneath. Text is beside it.]

Webcast | Webdiffusion

[It fades out, replaced by a Zoom video call with four people in separate video windows. In the bottom left window, the moderator smiles. He is a white man with short brown hair and glasses, and he wears a suit jacket over a checkered shirt. He sits in a home office, with the white wall and closed brown blinds visible behind him.]

Scott Jones: Good day and welcome to our session today on cyber security. I'm very pleased to be here. My name is Scott Jones and I'm the Head of the Canadian Centre for Cyber Security, part of the Communications Security Establishment. I'm very pleased to have been asked by the Canada School of Public Service to moderate this event. This is the fifth event in the New Economy series from the School, and it's a partnership between the Canada School of Public Service and the Centre for International Governance Innovation, or CIGI. It has been a great series so far, and I'm looking forward to hearing what our distinguished panel has to say today.

For all of you who are attending, as we all are, remotely, simultaneous interpretation is available in the language of your choice through the portal, and instructions were sent to you with the webcast link, along with copies of today's presentations. You are absolutely invited to take part, and your questions will be very welcome for the moderated Q&A session after the initial presentations, in about 30 minutes or so. You can submit questions or vote on your favourite questions using Wooclap, and instructions will be shown on screen.

[A web browser fills the screen on the Wooclap event page. Text reads, "How to participate? Web: Connect to www.wooclap.com/CYBER20. SMS: Not yet connected? Send @CYBER20 to (855) 910-9662." The four speakers' video windows reappear.]

So, this is very much an interesting series when we talk about how we start to tackle the cybersecurity challenge, and we have some great expertise here to talk about it. It's very often dismissed as a technical problem, something that "we just need to do this, make this fix to make it go away." But the fact is this is the underpinning of our economy and where we are, and it's only growing in terms of scope and scale. One of the things we've been trying to do as part of the Canadian Centre for Cyber Security is really start to tackle this at its root. How do we build the partnerships we need so that all of us don't have to worry about cybersecurity in every part of our day and we can move on? But what does this also mean when you talk about data and the amount of information that's available?

There's more information being collected every single day on every aspect of our lives. How does this all fit together? How do we start to tackle these challenges as a community, as a government, as academia, as the private sector, and start to really think these problems through? Because the fact is the traditional boundaries simply don't work anymore. So how do we deal with this problem in a world that doesn't fit nicely into the traditional government levers of legislation, spending or regulation, something that requires a fourth approach?

And we always say that as partnerships, but I'm really looking forward to hearing what our panellists have to say about the issues. But I will first introduce Aaron Shull who's the managing director and general counsel at CIGI, and he'll offer a few words of welcome, introduce CIGI itself, and then also our great panellists that are here today. I'm very much looking forward to hearing from them and thanks to Aaron for orchestrating the session, but also the whole series as part of this program. Aaron?

[In another window, Aaron nods. He is a white man with close-cropped hair and a trim beard. He wears a light blue button-down shirt with a white collar and a navy-blue tie. On the wall behind him, partly out of frame, is a poster for CIGI.]

Aaron Shull: Great. Thank you very much, Scott, and it's a pleasure to be with you. Your point is not lost on me. This is not an issue where we can just install antivirus software and the problem goes away. I think the way you framed and articulated it is really sound, and I just wanted you to know that I really appreciate you taking the time out of your day to moderate this important discussion. So, welcome and thank you for joining us for the fifth session of the New Economy series, which, as Scott said, is a partnership between CIGI and the Canada School. CIGI is a public policy research institute, or think tank, based in Waterloo. We were delighted to take this partnership on.

When I think about this issue, I think about the digital economy: a fundamental pillar of the modern world, a catalyst for increased connectivity and innovation, and something that continues to evolve and drive economic growth, prosperity and technological development. However, what brings us here today is that the digital world is more fragile than ever before. There's a proliferation of complex cybersecurity threats that are increasing in number, magnitude and sophistication. Intellectual property is being stolen from companies at an alarming rate. Foreign actors are meddling in elections through fake social media accounts and other nefarious means. More and more critical infrastructure is being digitally enabled, and therefore capable of being digitally disabled. And on top of this, governments and companies alike are falling prey to digital hacking schemes all over the place. The COVID-19 pandemic has also been a thematic lure or subterfuge for malicious threat actors to exploit vulnerabilities, stressing the importance of cyber readiness and resilience as a national economic imperative amid instability within the digital ecosystem.

So CIGI's hope in taking on this partnership with the Canada School was to bring some of the top thought leaders in the country together to address these important issues. We wanted to do everything in our power to support the growth and development of public servants in the strategic knowledge, skills and competencies needed to better serve Canadians in a rapidly changing, tumultuous and uncertain new economy. So, I do hope you do enjoy today's session. But before turning to the speakers, I'd like to first introduce my colleagues that are with us today. Rafal Rohozinski is a CIGI senior fellow and the principal at the SecDev Group where he leads its geopolitical digital risk practice.

[Rafal smiles. He is a white man with short grey-black hair, and he wears a black sport coat over a dark grey button-down shirt. Behind him is a full bookshelf and a red couch. A set of skis and ski poles lean against the back wall.]

Prior to founding SecDev, Rafal served as an advisor to the United Nations and other organizations in more than 37 countries and directed the Advanced Network Research Group at the University of Cambridge. He currently participates in a trilateral track 1.5 working group with the US, Russia and China on the military uses of cyberspace and is a frequent adviser and member of several corporate boards. He's also the author of numerous books, reports and studies examining the digital transformation and the evolution of policymaking and practice in cyberspace, and is a frequent commentator and keynote speaker.

We are also joined by Maithili Mavinkurve, who is the COO and cofounder of Sightline Innovation and is one of the first female founders of an artificial intelligence company in Canada.

[Maithili has medium-brown skin and long, dark hair. She wears glasses and a black and white striped shirt, and she sits in front of a white wall.]

As a member of the Digital Industries table of ISED's Economic Strategy Tables, Mai led the subgroup that developed the national data strategy and IP recommendations. She also represented Canada at the G7 ministerial meetings on AI and the Future of Work. Mai was recently acknowledged, and unsurprisingly so, as one of the 30 most influential women in AI in Canada. So those are our speakers today. To say that they bring a wealth of expertise to this topic would be the understatement of this meeting, but Scott, I'll turn it back to you. Thank you.

[There is a short pause.]

Scott Jones: Great. And like everybody, there was a struggle for the microphone or the mute button.

[Maithili and Rafal grin. Aaron's video window disappears.]

Well, thank you for that, Aaron, and thank you for that great introduction. I think the first step will be to turn it over to our speakers. And I think, Rafal, you're going to kick us off with setting the global context.

Rafal Rohozinski: Great. Well, listen, thank you very much and thank you very much to the Canada School for this opportunity. I'd like to say that just listening to both of you and Aaron make your initial statements, I was reminded of that "Bat Out of Hell" song, "You took the words right out of my mouth!"

[Maithili and Scott chuckle at Rafal's singing. Rafal's video window is pinned and it fills the screen.]

So thank you for that set-up. Look, I think it would be a gross underestimation to say that we are in the middle of a transformative moment in human history. In the past 20 years, more than two thirds of humanity have gained the ability to communicate on a planetary basis. The digital economy creates over 15 trillion dollars in value every year and is expected to account for 26 percent of global GDP in the next five years. New and emerging technologies such as 5G and artificial intelligence are going to continue to change the landscape beyond recognition. With 5G, we not only have ubiquitous and pervasive Internet access connecting all devices and systems and bringing into being new economic models, but we'll also see a gross expansion in the creation of data. By 2025, the world is going to be producing almost 175 zettabytes worth of data. That's five times the amount today, which is itself already double or triple the amount of only a few years ago, when, if you had taken every single grain of sand on the earth and multiplied it by five, you would get the approximate amount of data already being created.

These oceans of data will feed advances in artificial intelligence, which will increasingly transform the nature of work and institutions. By 2030, artificial intelligence is expected to contribute 15 trillion dollars of economic value. The size and scope of this transformation is difficult to fathom, but it's coming. It's also important to recognize the tremendous interdependencies that these technologies will continue to propel between countries. Innovation doesn't just come when you have a bright idea, wherever it's born; it's when economics make it cheaper to replace existing processes with new approaches that things really take off.

That's the case with technology; 5G, artificial intelligence and these data oceans I just talked about are going to drive innovation in places where the economics of deployment are most advantageous. That's going to be in high-density areas and regions, megacities and, yes, Asia. And while Canada, the United States and Europe may continue to lead in terms of the development of technology, the new applications and new economies that scale to a global level will most likely come from places like Asia and megacities, and we will at best become participants in and consumers of this next wave of innovation.

So, what does all this have to do with cybersecurity? Well, I think it's important to recognize that the foundations upon which this tsunami of technological change is coming are based on a technology that just turned 52. Now, if I was to turn the camera on myself, I think I'd recognize that my 52-year-old self is very different than my 20-year-old self in terms of capability and outlook.

[Maithili and Scott's windows reappear.]

And this is particularly true for the Internet, which, and I'll say it out loud, was never built with security in mind. The Internet was built for resilience, as a global Esperanto to connect together devices and networks. If you had attended a meeting of computer scientists or engineers just 10 or 15 years ago, the principal objective was just to make things work. Securing them against potential malfeasance, criminal intent or, as we have just seen in the last 24 hours with the indictment of Russian intelligence operatives by the US Department of Justice, state-directed activity: these were distant considerations.

As a result, over the last decade we've seen the mounting costs of an infrastructure that was never built with security in mind. We've witnessed the rising tide of cybercrime and, more recently, ransomware that has caused tremendous damage. The weaponization of code and its use by state actors has cascading global effects. Stuxnet's target may have been Iran's nuclear program in Natanz, but the sons and daughters of Stuxnet have caused billions of dollars of economic losses as runaway attack code, sometimes pilfered from the national security actors under whose stewardship it had remained, has been released.

Just one example is the NotPetya attacks that were used against Ukraine, but which also impacted on Maersk, the international shipping giant, inflicting over a billion dollars in total losses, and leading to a 20 percent decline in Maersk's global market. The code originated within the National Security Agency before being stolen and reappropriated by the Russian security services and let loose in the global data ecosystem.

So in this context, what do countries need to be doing about cybersecurity? How should they be thinking about it? Well, first of all, we need to recognize that cybersecurity is not just about the protection of networks, digital devices and data. Cybersecurity is about the defence of an increasingly complex and interconnected digital society, where cascading effects stem from network failures and institutional and human gaps and represent a significant and growing liability. We need to reconceptualize cybersecurity as digital resilience: a set of policies, practices and capabilities that help us anticipate, prepare for, prevent and respond to the inevitable crises and disasters that will impact our increasingly digitally dependent society.

Digital resilience needs to take into account the linkage between societal values, priorities and the means to secure digital foundations. And in this respect, digital resilience has three principal enablers. The first is cybersecurity, which is dependent upon creating a national strategy and putting in place the authorities, permissions, budgets and capabilities that ensure a coordinated governmental response covering all digital systems at the national level. This includes computer emergency response teams and computer incident response teams, as well as broad awareness building and skills development. For the most part, we've gotten this part right. The development and emergence of the Canadian Centre for Cyber Security, and the investment in a national cybersecurity strategy, have done much to create the basis for a broader cybersecurity culture across the public sector and what's been identified as critical national infrastructure. The latter needs to be broadened, and the use of information sharing centres and arrangements has to apply to all industrial sectors to provide awareness, knowledge sharing and capability sharing, as appropriate, to all sectors and all levels of society.

The second key enabler of digital resilience is a focus on business continuity. Future crises will be varied, ranging from climate change to natural disasters, but planning for, anticipating and managing the consequences is something that can be achieved. For governments, this means aligning national crisis capabilities with responding to deliberate and accidental cyber incidents. For business, this means changing existing risk models in order to incorporate digital risk as part of their value-at-risk calculations. Businesses will be breached. Data will be lost. Assets will be ruined. But enterprise risk frameworks that anticipate these risks can be adjusted to use a variety of levers, including insurance and other third-party risk transfer mechanisms, to make these crises manageable.

[Rafal's window fills the screen.]

The third enabler of digital resilience is a strong data governance and privacy approach. Confidence in the integrity of information and data will only grow in importance as systems and processes are digitized and as technology, such as artificial intelligence, starts to mediate many of the functions currently served by people. A trusted digital ecosystem is essential, as without it, confidence in the economy and public institutions will be strongly impacted by the inevitable crises to come.

Underlying these three enablers is the need to grow a new generation of digital citizens. This will require significant and sustained investment in human capital. Advanced economies are experiencing an acute shortage of digitally skilled workers. Upwards of 800 thousand jobs in the IT field remained unfilled in the EU in 2020. The World Economic Forum estimates that 133 new job types and professions will emerge as a result of the division of labour between humans, machines and algorithms. By 2022, upwards of 54 percent of all workers will require significant reskilling and lifelong learning to acquire and maintain basic digital skills.

Now, realigning development priorities around resilience is a core component of digital transformation and will also require us to reimagine citizenship for the digital era. In a digitally connected world, social data platforms and services impact the scale and quality of civic discourse and participation in ways consequential to our representative government. Reimagining citizenship for the digital age will need to take into account governance that currently occurs at the local, national and global levels. Future digital citizens of Canada will be subject to rights and responsibilities that will be exercised within their national jurisdiction, as well as guided by the terms of service of global data platforms. Developing the right balance between these multilevel rights will be important and fundamental to citizenship and governance in the digital era and to stronger and more resilient digital societies.

Now, we need not look any further than the experience of the past six months under COVID to recognize just how much these enablers are critical to the functioning of society under crisis. In the past half year, digital onboarding has accelerated and we've achieved more since March than we have in the last 10 years. COVID has precipitated the world's largest natural experiment and lessons have been learned. Now, in this context, the experience of South Korea, a country that has thus far successfully avoided the worst of COVID-19, is perhaps most noteworthy. South Korea was one of the first countries to register COVID-19, but it's also flattened the curve in a stunning fashion. Among the many factors that contributed to their relative success are the way that public officials, scientists, citizens collaborated to detect, contain and treat vulnerable people. It also rapidly elaborated contact tracing systems and surged health care capabilities, including to hotspots. But the secret of South Korea's success also resides in its digital resiliency.

The South Korean authorities invested heavily in screening and diagnosis, and rolled out health check apps and GPS tools to monitor and enforce quarantines. The government worked with the private sector to repurpose existing data technologies and services, such as the CCTV and sensor infrastructure previously used to monitor traffic and pollution. Within weeks of discovering the virus, South Korea had mobilized digital tools to improve diagnostics, strengthen telemedicine and make data available to improve domestic awareness and response. It was not just developing these innovations locally, but is now also starting to export them around the world. As a result, South Korea has managed to recover and is in a positive position to take advantage of the burgeoning digital economy. The government has greased the wheels of its economic recovery by making rent relief and financial credits available to small and medium enterprises. And a number of ministries stepped up their cooperation with the private sector to do everything from dramatically expanding production of testing to strengthening outreach and support for remote working. Meanwhile, the government has also supported online schooling opportunities and provided over 33 thousand smart devices and free Internet services to low-income families and students.

Now, South Korea's experience is a stark reminder of the importance of cultivating digital resilience in a digitized world: the ability not just to ensure the continuity of services, but also to bounce back. Learn and improve: that's the mantra of smart governance in the 21st century. It's not enough just to try to survive a crisis and keep services functioning, including online. Digital resilience is a down payment on a more secure future, which explains why South Korea is racing ahead to build a new digital deal organized around 5G, artificial intelligence and data protection. Now, not every country can be South Korea, but we can learn from the experience. And one of those lessons is to recognize that digital transformation and digital resilience go hand in glove. Although uncertainty is inevitable, future pandemics, climate change and digital risks are not. Thank you.

[The other speakers' windows reappear.]

Scott Jones: Great. Well, thank you, Rafal. Mai, over to you.

[Maithili smiles.]

Maithili Mavinkurve: Thank you, everyone. It's great to be here. So I think we're going to have some slides being displayed as well that will be shared shortly.

[A slide show presentation fills the screen. A black title card with the CIGI logo reads, "The New Economy Series: Cybersecurity. Maithili Mavinkurve, COO and Co-Founder, Sightline Innovation. Rafal Rohozinski, Senior Fellow, Centre for International Governance Innovation, and Principal, SecDev Group."]

Awesome, thank you very much. Great. Well, good afternoon everyone, and thanks for the opportunity to speak with you. Perhaps we could switch to the next slide, please.

[The first slide is titled "The History of Enterprise Computing." Seven circles of various colours each have a different technology written in them, with a line pointing to their era. They are as follows:

  • Big Iron: 1990-2010
  • Mobile: 2006-Current
  • Cloud: 2009-Current
  • Big Data: 2009-Current
  • AI: 2009-Current
  • IoT Blockchain: 2011-current
  • Data Sovereignty: 2017-BEYOND.

Text beneath this graph reads, "Technology is moving rapidly, breaking things and innovating. A perfect storm is brewing. Technology is challenging our fundamental understanding and values of sovereignty and freedom."]

A lot of change has taken place in computing and enterprise technology over the last few decades. It's clear that technology is changing at a breakneck speed; those in technology will tell you that even they can't keep up. In the past 20 years, we've seen mobile, cloud, big data, AI, IoT, so many buzzwords disrupting almost every single industry and transforming our day-to-day lives. Due to COVID, we've seen entire businesses and schools move online to the cloud. More apps are collecting our health and location data to provide us with contactless services. It's undeniable that technology is one of the most powerful tools for creating a resilient and strong economy. However, we're also seeing that along the way, it's creating a lot of friction. Next slide, please.

[The next slide is titled, "Timeline of Global Data Governance Action Since Equifax Data Breach, Cambridge Analytica Data Scandal and GDPR Enactment." A graph depicts various companies' data breaches from 2017 through 2019, which are all followed by various data protection acts from the UK, Asia, Canada, the US, California, France, Australia, and other regions. Text below reads, "79% countries with e-transaction laws. 52% countries with consumer protection laws. 58% countries with privacy laws. 72% countries with cybercrime laws."]

In the span of a few years, between 2017 and 2019, we saw a massive shift taking place in the world of data and digital. Companies were experiencing data breaches. And then I think a turning point as well for government and citizens—the infamous Cambridge Analytica scandal. What we saw with the Cambridge Analytica event was the raw power of data and AI at scale really having the astounding ability to sway opinions around an election. I think at that moment, the digital and real worlds collided in an ugly way. In recent history, in technology, we're seeing it spark debates about ethics, privacy, free speech, misinformation. We're seeing deepfake videos, surveillance, algorithmic control—I mean, the list is just endless. But what we've also seen is that in parallel, large technology companies are thriving and creating immense economic value by utilizing this data to understand users better and being able to sell products and services, and obviously creating a lot of convenience for us as well.

So, all this to say the data economy is obviously in full swing. Data and digital have become a currency for success, but I do think it's been at the expense of trust. This economy demands a renewed urgency and importance in cybersecurity. Technology, I believe, is challenging our fundamental values and understanding of rights and freedoms in the digital world. Cybersecurity is what's going to make sure that we restore that trust and create the digital world we actually want. Next slide.

[Text on the next slide reads, "Big shifts are occurring in the global technology ecosystem. Canada must respond swiftly in partnership with allies, public and private sector entities." Beside this, a blue circle is split into five horizontal sections. Each one contains a line of text:

  • Data has become valuable. Big tech has a head start.
  • Increasing demand for strong data governance by consumers.
  • Weak data governance is impacting customer loyalty and citizen trust.
  • Governments must race to regulate data markets, protect citizens and push for standardization.
  • Growing asymmetry exists between digital haves and have nots."]

We've seen a lot of shifts in the global tech ecosystem, and as the lines between the physical and digital worlds become increasingly blurred, governments, I believe, are in an interesting position to now define what their citizens' digital futures will be. The same protections that we expect and need in the physical world to maintain our freedoms, we need those in the virtual cyber world as well. We have to be proactive and define what is law and order in the online world. We have to expand our thinking that cybersecurity is not just about keeping our systems and information safe from attack. It's not just about defence. It's about how we define, uphold and enforce the rules and values of Canada in the online world. The definition of cybersecurity itself perhaps has to evolve to include more than just security. It should include trust, policy, cyber law, cyber economics, anything and everything in between. What does our cyber society look like?

So, what do we do about all this? How do we not just protect our citizens and our industries, but how do we make Canada a strong, secure digital nation? Next slide, please.

[The slide's title is "Canada's Cybersecurity SWOT – Strengths." Text below this reads, "Trust and Fairness is embedded in our values system. We are leaders in:

  • Artificial Intelligence & Ethical AI
  • Trust Frameworks and Data Trusts
  • Digital Identity
  • Standards
  • Public-Private Collaborative Innovation Leadership"

Text at the bottom reads, "We have an obligation to ensure there is a proper world order and value system in the digital world."]

What I put together is a SWOT analysis of sorts for Canada. Globally, Canada is seen as a trusted nation. Trust, equality, fairness. It's embedded in our value system. And this is an extremely powerful foundation from which to build. In all my experience and work, I've had the privilege of working with federal and provincial digital leaders, as well as working over 15 years in the technology ecosystem here. We are leaders in AI. We're working hard to define ethical AI practices, digital and data trust frameworks. Our standards bodies are working hard to create governance and standardization around all of these important topics that will shape how our future digital world operates. As well, we have a unique innovation ecosystem that fosters public and private collaboration. Next slide.

[The slide is titled, "Canada's Cybersecurity SWOT – Weaknesses." The list follows:

  • Privacy Regulations, Frameworks, Policies need to be updated
  • Heavy reliance on Foreign firms
  • Heavy on research, not on large global multinational technology enterprises
  • Cybersecurity Talent
  • Lack of Cyber resiliency across many of our sectors."]

Our weaknesses, though, when it comes to having a strong posture, are primarily around privacy regulations and our policies that relate to digital and data. We have some work to do. We continue to have a heavy reliance on foreign tech firms, and we continue to invest heavily in research and not enough in creating some large technology enterprises here. Our cybersecurity and digital talent gap, I believe, is still large, and that all leads to a lack of cyber resiliency across all our sectors. And this can be a vulnerability for us. Next slide.

["Canada's Cybersecurity SWOT – Threats." The list reads:

  • Foreign actors both state and non are aiming to undermine both our economy and our democracy
  • Hasty decisions with 5G network deployments
  • They collect data in unlawful ways both invasive and non-invasive methods
    • For example, to get access to Canadian data, enemy states can use state owned and foreign tech firms to collect data via apps and partner with our university researchers
  • Large datasets can reveal patterns and trends in human behaviour, which help with intelligence and propaganda as well as surveillance"]

Despite this, I'm extremely excited that we're well positioned and we can be ready for the digital future. Domestically, we have to start by recognizing the immense opportunity in the data economy. Each industry has the opportunity to transform itself, and we must enable them to do that in a secure way. In my work at Sightline, we're on the front lines helping different industries, from cities to agriculture, leverage their data assets in a trusted and responsible way. We have to invest in consumer education and upgrade our policies and our regulations, and globally, I believe we have to maintain our leadership in AI. And we have to make a shift here to move from research to commercialization and ramp up aggressively on the governance and regulation around that. We have to uphold the values of Canadians and stand up to things that threaten our digital or physical democracy. Next slide.

[The slide shifts.]

The threats we face—next slide, please.

[It skips to the following slide.]

Oh, sorry, go back, sorry. I apologize.

[The presentation returns to the previous slide, titled "Canada's Cybersecurity SWOT (O)." A line reads, "Technology, politics, and economies are now intertwined. All these stakeholders must come together to stake our claim in the digital frontier." A table is split into two sides, one titled "Domestically" and the other "Globally." They are as follows:

  • Domestically
    • We have massive core industries that we can support and transform in a secure way
    • Cities, healthcare, agriculture – Sightline is solving big problems through data/digital collaboration (via Datatrusts)
    • Encourage business model transformation – view data as an asset
    • Upgrade our laws & regulations: look to CPRA and GDPR
  • Globally
    • We must maintain our leadership position in AI but transition it out of research
    • We need to be strategic about our allies and our own interests regardless of how things go with our southern neighbour
    • We must uphold the values of Canadians
    • We must stand up to values that threaten our digital democracy]

The threats we face are real: breaches, ransomware attacks, malware. These will only increase, and as we expand into 5G networks, the risks can grow even more. And we will see threats not just to our businesses, but also to our democracy. Non-invasive means, in particular, are increasingly nerve-racking and important for us to take a look at. It's possible for an enemy to create seemingly harmless means by which to collect data about Canadians to manipulate us unknowingly. The same thing large tech companies have done with our data to make money lawfully could be done lawfully by enemies for unlawful or unethical manipulation and surveillance purposes. Next slide.

[The slide is titled "Canada's Ultimate Opportunity – The Untouched Realm of Data Trusts and Data Sovereignty." A circular graphic is labelled "Sightline Innovative Datatrust." It depicts various images of people withing with technology. These are labelled: Trustee, Policies, Data Consumer, Data Protector, and Assets. Each one has a double-sided arrow connecting it to the next. To the right of this graphic is a map of Canada filled in with a Canadian flag. Text beneath it reads, "The Data Trust Capital of the World. Canada is well positioned to lead global data trust governance for the digital era. Cybersecurity is central to our long term prosperity, and maintaining public trust between governments and citizens, corporations and customers, and NGOs and stakeholders."]

So, here Canada plays a unique role. I think this realm of data sovereignty, data governance and data trusts is an untouched arena where Canada has an advantage, and we can create and explore new models for building out this economic prosperity. We have a unique opportunity to position ourselves as leaders in digital and AI governance, and I believe that cybersecurity is central to our long-term prosperity. Next slide.

[A five-item list is titled, "Canada's Call to Action." The list reads:

  1. "We must become a "Trusted Digital Nation"
  2. Invest in data utilities, collaboratives and trusts
  3. Aggressively maintain our leadership position in Ethical AI
  4. Invest in homegrown hardware and software security solutions
  5. Close the skill gap"]

So finally, what is our call to action? What must we do? The chaotic climate that we see today around global politics, the pandemic and the speed of technology change, I believe it's all exacerbating the need for an entire recalibration. So let's set our North Star to be that trusted digital nation. Let's invest in our industries to transform digitally, perhaps explore new frameworks for creating data utilities, collaboratives and trusts—things that are already underway. These are initiatives that we're seeing across the country. And as I've said many times before, Canada is a leader in AI. We have to continue our efforts and focus while defining these guardrails. We have a lot of smart people and a lot of smart businesses. So let's cultivate and invest in Canadian tools and technology as well. And lastly, let's close that skills gap. We have to invest in cyber literacy. I believe that the cost of inaction could be the loss of the very rights and freedoms that we currently take for granted. So thank you all for the time. I hope you enjoyed the presentation.

[The slide closes, and Maithili, Aaron and Rafal's windows reappear.]

Scott Jones: Great. Thank you both. So many rich ideas in there to explore. Excellent presentations. I loved how you both ended by talking about all the opportunities as well. So, thank you both. I'd just like to remind all of our viewers and participants in the session that you can submit and vote on questions through the Wooclap platform. The link should be appearing on your screen now or within the next few seconds. So while we wait for your questions to come in, this is your session, after all, I will maybe kick off the discussion by picking up on a couple of themes we just heard. You both talked about the global environment we're in, the technology environment, but also that we're part of a bigger ecosystem, one where we sometimes suffer its effects.

[The Wooclap website page reappears, with the directions to go to www.wooclap.com/CYBER20 or text @CYBER20 to (855) 910-9662.]

One of the things I've heard, for example, from industry around the world is that the Canadian market is small. So when we make demands or we request things, we're 37 million in a worldwide market of six billion, etc. and why GDPR, for example, was really successful was because of the size of the European market.

[The page closes and the speakers' windows reappear.]

Market power can force things, people to comply. How do you see that playing out? And how would you leverage some of those things—like we are a leader in AI, we are working on ethical AI and I see it referenced around the world, the work we're doing. But how would you see that playing out in the global context? And what would Canada need to do to make some of these things successful?

Maithili Mavinkurve: I can dive in. Thank you, Scott, for that question. Like I said before, Canada's been a leader in AI. We were early adopters on the research side and a lot of the work that I'm involved in on both the federal and provincial levels as well as with the Standards Council, we're really pushing for the responsible AI aspect. I think we have to really look at how can we do more there with the EU. I think we have to collaborate a lot more on what those data regulations look like. We need to, I think, overhaul—well, maybe not overhaul, but really need to upgrade PIPEDA. I think we're really lagging behind. And we need to look at the interconnectivity between just AI and derivative data. So, not just raw data. And I think that's kind of a key aspect here. PIPEDA and GDPR, all these really look at personal information—how are we protecting personal information? But what about the implications of when that data is combined with other data sets? Are there new frameworks from a technology perspective as well? I think we've got immense credibility in the space around ideas of how we can have better trusted governance through technology. So I think we have to look at what really are the strengths that we have—what can we bring to the table and where do we need to partner with other companies? But absolutely on the AI side, we have to continue the work that we're doing and prove it out, I think, in real-world scenarios, not just from a conceptual policy perspective.

[Scott nods.]

Rafal Rohozinski: I think, Scott, that your question actually raises a little Pandora's box of things that we can explore here. First of all, I think we have to be very clear that all of these technological advances are going to face us with some really challenging ethical dimensions. What is it that we want to be as a society? What kinds of choices do we want data and technology to make for us? Of course, right now, we're dealing with the consequences of socially mediated data on social media platforms. But AI and oceans of data and computational power are going to make huge changes in the ethics of health care, and huge impacts in terms of the kinds of decisions that were previously made by individuals—the whole question of autonomy. And I think there we do have a problem, Scott, in the sense that 37 million voices, united or disunited, are a drop in the bucket against six billion people on this planet. We have to recognize the fact, as I said during my talk, that although we may remain leaders in the development of technology, the application of technology and the norms around that application are going to be developed where the economics make it real to do, and that is going to be in the high population density cities and countries.

But there's another aspect to this, and that is how we use institutions as a means of being able to redress the balance between our population size and coming up with global norms. And I think there, the global norms debate over AI starts becoming tangible, fungible and real. I think we have to recognize that in cybersecurity, in the governance of this technical commons, we've had quite a failure. Look at it this way: between the time that Sputnik was launched and the time that we had a full-fledged UN office for the governance of outer space, 10 years elapsed. It was fast. And yet in the 24 years since Al Gore invented the Internet—ha ha—we have yet to see some form of consensus around the global governance of this technology. Yes, the multi-stakeholder approach has allowed this technology to propagate globally in a way that perhaps no other technology has. But it also has left us with this policy conundrum, these ethical questions that don't have a natural home on the stage where those decisions could most naturally be made within the UN, within the World Bank, within those institutions that give voice to those that otherwise numerically don't.

[Maithili nods emphatically.]

Scott Jones: So there's a lot there again, and I agree with you about the Pandora's box because I think both of you talked about our values and the Canada we want in the future. But one of the things I think that's also become clear, especially when we talk about the Internet and Internet governance and other nations, is our values don't necessarily align with other countries. There are competing sets of values out there for the Internet and particularly around things like global standards, which I don't want to get into and jump ahead in the series. But I do think that's one of those things where Canada's values of data protection, data privacy, wherever we are with our laws, et cetera, align with perhaps Europe better than they do with some of those other parts of the world.

I'm wondering, how do you see that playing out, especially in light of some of the things like our leadership in AI, the investment we've made over—because this sector didn't appear overnight. This is a sector where Canada has been investing for 20, 30, 40 years, building up expertise, intellectual capability, new ideas, trying things out. How do you see that playing out in terms of what our ethics are, what our privacy and what Canada's values are versus where we are in the world? Are we unique and mostly alone or are there allies out there that we can work with, in your opinion?

Maithili Mavinkurve: I can comment more on the national level, and Rafal, obviously, is an expert on the global perspective, but I fundamentally believe that we are in a unique position. I believe that our values really are aligned well with defining what ethical AI needs to look like and what data governance—how that's going to play out. But to Rafal's point and a point that I was trying to make earlier, we have to get away from the research, the concepts, just the talking. We really need to have better collaboration in trying out some of these ideas. And that's what we're seeing. We're seeing in the past 10, 20 years, technology companies—and I'm a tech company founder—they're going to push the envelope and they're going to do things that maybe they didn't even intend to do. And it really is this—I won't go as far as saying Wild West, per se—but it really is that sort of scenario that's like a perfect storm. So we need to test out what frameworks, what values are going to work. How are we going to actually embed this in real applications? This is not going to just be a checklist for saying, "Do you meet the regulatory requirements?" I think it needs a complete shift in thinking about "how does digital and data impact me?" And so it's going to be at all levels. It's going to be just plain old citizens, consumers participating all the way up to leadership. And I think it kind of brings a point around ethical leadership as well. How do we embed ethical thinking in everyday operation of businesses? Because right now we think, "OK, well, I've got a security team there doing their cybersecurity stuff. We're good to go." And so I really would like to see us thinking more about ethical leadership in how we move companies forward in Canada, whether it's AI or even just data-related. So, please, Rafal, I'd love to hear your thoughts as well.

Rafal Rohozinski: Yeah. First of all, I'm in violent agreement with everything that you said, but maybe I'll tie it back to an earlier precedent, to something that Scott was saying. In the last 15, 20 years, we've had the emergence of essentially two blocs that are vying for alternative views on the governance of the Internet space. The like-minded, which are really the European countries, plus Canada, Australia and others, and then a bloc broadly represented by China and Russia, along with a number of other countries. The challenge between these two blocs has been as follows. The like-minded, meaning us, have always said we need to keep our values front and centre and they're just as good the way that they are. Whereas people like the Russians and Chinese have been saying, "technology is changing fast. This has impact on our societies. We need to be thinking of new models." Now, I'm not going to say that there's a good or bad approach here. I personally sit in our camp, but the failure to take a step forward, to try to conceptualize what a new Internet governance looks like and to test ideas, whether those are ethical propositions or practical propositions, has meant that the like-minded have lost voice and credibility. And we cannot afford to have that happen in areas like artificial intelligence, data governance and elsewhere.

Look, this is a marketplace of ideas. And I don't know whether it was Gordon Gekko who once said it, "When I need ethics, I'll buy them." But to some extent, you have to recognize that whether we like it or not, the possibilities that these technologies have brought into being, and the fact that they are now cheaper to employ than keeping the status quo, mean that innovation is going to happen in a wild way. If you want to use stem cells to cure yourself of diseases, I can give you a half a dozen countries you can go to and get them, ethical considerations around the use of fetal stem cells notwithstanding. That's only going to multiply with technologies like artificial intelligence that don't require the same kind of physical capital to move. I think we have to be ready for that and we have to be ready for that in a very significant way.

Scott Jones: Yeah, and I think that was a very provocative way to end this and the challenge we face. And it goes to one of the questions that comes from the audience, which really is that continuing the work and the development and commercialization of artificial intelligence techniques, and applying them, really does rely on getting data, whether harvesting, scraping, collection, et cetera. And a lot of times that data comes from individuals who might not be aware of where it's being used, to the point that was made earlier that it's not the raw data, it's sometimes the secondary use, what you can infer from the data, how you can build profiles. And the question really is, how can Canada encourage the growth of this industry and continue to have the information available so that the research and the work and commercialization can happen, but at the same time respect the privacy rights of Canadians? And one of the examples I would use is the COVID Alert app, an app that was designed fundamentally from scratch based on privacy, protecting the rights of each individual Canadian, and that involved privacy commissioners from multiple parts of Canada. But yet there's still this underlying suspicion that because the government worked on this, it has to be collecting information. And yet we give far more to any social media platform any day of the week when we turn it on. So I'm not sure if Rafal or Mai, you want to jump in first. I'll just leave that bomb right on your desk.

[Rafal and Maithili smile.]

Maithili Mavinkurve: Yeah, that's definitely a bomb! I think this is going to be an evolving scenario, the collection of data by social media companies versus government. It's an ongoing saga. And actually I would tell everyone, if you haven't watched this show on Netflix, it's called The Social Dilemma. It's pretty interesting, quite eye-opening in terms of the data that's being collected about us and the slight changes in behaviour that are happening. But coming back to your point, I think there's a lot of negativity that has obviously been created around the idea that AI relies on data and there's harvesting and scraping, and it's just these really terrible terms. But there's a lot of good as well that can happen.

And I think the concept of having informed consent is quite critical. That's why literacy around how technology and tools work is important. We are happy to share information with our doctors. There are certain contexts in which we have trusted conversations. We have legal boundaries around how that information is shared. We need to start to look at how that transfers into the digital world. Are we able to create constructs? There's a lot of conversation going on right now around data trusts: the ability for us to have data in raw format, with informed consent, shared and governed by a trusted party, whether that's a third party or not, to be able to have these collaborations with data. There are also tools and technologies around actually anonymizing data. These are still very, very early technologies and we're going to see how they play out. But the more guardrails we put in place in these applications, the more comfortable consumers are going to feel. But I still think that informed consent is a big piece of the puzzle to be able to make this work.

Rafal Rohozinski: Yeah. I'd add to that, look, if you're concerned about social media platforms, wait until 5G. And that's just because the ubiquity of everything becoming connected as a platform for the collection and processing of data is going to be such that the issue of informed consent, unfortunately, is going to become a lot more difficult. When I, as a subject, use OC Transpo, walk through a city, use my bank, and everything is centralized on the devices that sense my presence, then how much informed consent is there? I think this is a larger question and it is something that I touched on in my talk. We need to be moving to defining what a trusted data ecosystem is at a national level. And that includes being very specific in terms of defining data rights, multiple data rights, some of which are transactional, some of which require consent, some of which may be seen as a public good. But we can't try to lawyer our way through this by just looking at privacy as the single lens through which we understand the role of data in society. Data is a resource. Data is a transactional token. Data is an infrastructure for our emerging digital age and society. And I think it requires that kind of approach in order to be able to get at the heart of that question.

Scott Jones: I have the feeling that we could probably talk about this subject for another few hours, but time is absolutely flying by with this conversation. So I was wondering if we could maybe switch subjects just slightly. You both talked about the cyber skills gap, the need for citizens to build up their skill sets, digital awareness and digital resiliency, but also that the public sector faces the same challenges in terms of the public policy challenges we face, the use of data in governance, the ability to make decisions. So, what are your thoughts on what Canada needs to do to better close that cyber skills gap? How do we build off of some of the successes we've had but really start to make this something that citizens and public servants can care about and address? Maybe, Rafal, do you want to start?

Rafal Rohozinski: Yeah, sure. I alluded to it in my talk, but I think we need to go beyond simply understanding digital skills and recognize that we need a redefinition of what citizenship is in the digital era. And that requires a root and branch understanding of that, from the way that we teach our children and the schooling that they get, to the approach that we take to ongoing education and skills training in both the public and private workforce. I've got two young kids and I can tell you that they have more support in learning how to brush their teeth through Murphy the Molar, or how to cross the street through Elmer the Safety Elephant, than in understanding a technology that, under COVID, they currently spend between eight and 10 hours a day engaged in. This is a fundamental issue. And I think being able to understand that from the citizenship perspective means that we have these avenues through which we can address it, starting with education and ending with lifelong learning.

Maithili Mavinkurve: Yeah, I would definitely agree with you, Rafal. I also have two young children and they are digitally savvy. They are digital natives. They understand how to use technology, they may not understand how it works. We've seen a challenge on STEM and pushing kids to go into STEM, pushing women to go into STEM, all these things. There are ways for us to invest better in high school, in elementary school, middle school, universities. We need to start building in—and I think building in not just cyber security or digital literacy, but also ethics. We need to cultivate that curiosity around how technology works, just like how my kids will say—they'll watch an ad on TV or on YouTube and we get them to question, "Don't believe every single thing you hear or read. Go do your research, go try and understand it." We need to start cultivating this type of digital, as you say, citizenship. What are the obligations of us as digital citizens? We cannot only just be consumers of information and services; we have to question. And so it's going to take a lot of effort at all ages. And certainly in the public sector, I would really encourage collaboration with industry. Get out there, learn, talk, connect as much as possible with those in the technology sector, as well as in other disciplines like law and finance, and start to have these types of conversations to really understand what are the implications of this type of tech.

Scott Jones: Great. And on that—I was actually going to save this question for the end, but both of you have been very successful entrepreneurs in setting up some excellent organizations in this country. With that, how would you see us engaging? One of the things—it's always difficult as a public servant. As a senior official, under the Lobbying Act I have to record all my interactions. It is quite the disincentive to want to go and do this, but those laws were created for a very good reason. What would work for you? I've heard some folks tell me, "Look, we like practical outcomes. Come to us with a real problem. We'll work together on it, set an end date. We have no interest in being on a committee that never ends," which I will admit sometimes the government can be very good at creating. "On the other hand, none of these are going anywhere. So I don't have time and I've got a business to run." And so you kind of get the two extremes. What would work? What works for engaging, other than doing one of these great sessions with such great speakers? What would work and make it easier for you both, as leaders in your fields, to work with the government?

Rafal Rohozinski: Maybe I'll start again. You know, it's been said that any sufficiently sophisticated technology may as well be magic. And I think part of the problem is that policymakers don't do magic particularly well. Certainly, one of my big frustrations over the last 10 years has been that trying to engage the political level, in particular, has been difficult, because these issues have been so esoteric, so disconnected from the kinds of things that they talk about with their constituents and the kinds of things that help them win elections, that there was really no priority for them to set direction.

Look, everybody online here who's a public servant will recognize that we live in a federal bureaucracy, a government bureaucracy, that's departmentally focused. We don't do interagency particularly well—certainly not as well as some of our peers. And I think that's created a problem for people like myself: an issue that may actually touch on multiple departmental jurisdictions doesn't have a natural resting place. I can talk to Public Safety about one issue, but it's actually the Justice Department that needs to be answering the more fundamental question. ISED may be very important for one piece of the puzzle, but it's actually the Department of National Defence that I need to be talking to about others. So I think there's a real need to start thinking not just in terms of joined-up government as a concept, but making it a practical reality.

I'd say that in the last two years, things have gotten better. There is an awareness of the importance of the digital economy, a rising consciousness around the severity of cyber threats—and I'm not just talking about breaches. People are remembering the fact that—and you mentioned this, Scott—we may have now put two decades into artificial intelligence, but prior to 1992 we had put four decades into telecommunications, which got rolled into Nortel. And Nortel got owned by the Chinese. We gave it away. It wasn't the Avro Arrow, which we simply decided to cancel. It was stolen. So the understanding of the severity of cybersecurity has, I think, opened a lot of eyes. But a lot more work needs to be done. And, again, that joined-up nature of government—thinking about these problems across the whole of government rather than departmentally—is really the major reform that would make things easier.

Maithili Mavinkurve: Yeah, I'd say that in all my experience talking and working with government, and being involved in a lot of organizations that are participating in government discussions, one thing I have noticed is that the quality of the conversations has gone up immensely. The understanding around digital and data has gone up considerably. And I think that's a great sign that we're headed in the right direction. One quick thing I would say in terms of how government can interact better: don't compete.

[She chuckles.]

Don't compete with the technology companies that we have here. You've got to collaborate. You've got to do it together. So that's really something I would encourage. Don't try and reinvent the wheel. We've got great companies and people within industry and we want to help. We want to work together. So that's really one area that I would say we need to do better at. Work together, collaborate.

Scott Jones: Great, thank you. Thanks for that. There was a bit of a selfish reason for asking that question as well, so always appreciate hearing the advice.

[Maithili laughs.]

I think maybe we can shift a little bit towards some of the fundamental concepts in cybersecurity, like protectionism. People are applying that idea to cybersecurity, meaning "we'll keep the data in Canada, it'll be safer here," and only do that. You both talked about how the global technology environment shifts to wherever it's economical, including the commercialization we discussed earlier in many different areas. So how should Canada approach the fact that we're on a global Internet? There's a resiliency issue when data is located somewhere else, and you can always have the power of a state invoking itself when you're talking about an actual physical asset. How does Canada start to think about digital resiliency in a global world that is fundamentally less certain than what we would have said even two or three months ago, let alone years ago, in terms of how we expected the world to evolve? I know that's a big, loaded question, but I'm curious to hear your thoughts on what Canada can do to make sure that we're prepared for a world where we are going to be connected. Our data needs to be there, because that's how our cities are going to run. That's how our lives are going to be made better.

[There is a brief pause.]

Maithili Mavinkurve: Rafal, you want to go first?

[She chuckles, and Rafal smiles.]

Rafal Rohozinski: Sure. Look, I think, Scott, you've alluded to something which is a huge elephant in the room. We are becoming more technologically interdependent now than we ever have been before. And yet this is happening precisely as we see a steep decline in the international rules-based order upon which international peace and security have been built since the Second World War. These are contradictory tendencies. If we were to hypothesize for a second and imagine what would have happened if President Trump had imposed sanctions on the provision of the Amazon Cloud to Canada, rather than aluminum tariffs, as a way of forcing Canada to redress the trade balance, I think we'd be having a very different conversation today than the one we're having now.

Now, that may be an outside possibility—and I'm not trying to scare anyone by saying that—but I think there's a point here. On one hand, we need to be prepared for global interdependence. On the other hand, we also need to clearly define the necessary sovereign capabilities that we need to maintain and develop. What are those sovereign capabilities and our resilience in terms of infrastructure? Are they processes that we maintain on a standby basis and use only in times of crisis, or is this a more comprehensive industrial policy that recognizes we need to build industries that perhaps aren't naturally ones we would develop if we were looking at a completely friction-free global environment? I think simply hoping and praying that globalization will make things better is not necessarily the approach we should be taking. On the other hand, neither is autarky. So it's a difficult time, but we need to find a balance.

Maithili Mavinkurve: Yeah, that's exactly what I was going to say in terms of the balance part. I think we're going to have to work with our allies in certain areas, but also be quite prepared within our own borders to make sure that we don't have an unhealthy reliance on other countries when it comes to core network or core infrastructure requirements. The other aspect is regulations around how our data is treated even when it leaves our borders. That's going to be important regardless of which technology providers we choose—whether they're foreign providers coming here to provide digital infrastructure, or whether it's Canadian data that's being handled by a foreign company or processed elsewhere. We're going to have to start defining what those guardrails look like. And it's really no different than the fact that, as a Canadian citizen, I have certain protections when I leave the country—well, it depends on which countries you go to, but there are certain protections that the Canadian government offers me when I travel. We have to transfer these types of concepts into the digital realm. And I think there is a lot of opportunity there, because some of it is undefined. That is partly what we are trying to achieve here: to understand what we need to do as we move into the digital realm.

Scott Jones: Great. One of the things that has been mentioned a few times is the importance of connected infrastructure, especially as it grows. The pressure to adopt it might not be there yet in, say, some of the smaller cities, but certainly in our bigger cities we're seeing the growth of 5G connected networks—high speed, high resiliency, low latency—and also the growth of the network of sensors that are going to be out there, something you both talked about, not just from the data perspective but also in terms of the way we need to live our lives.

How dependent do you see improving cybersecurity as being on the actual physical infrastructure that comes with it, and on the resiliency requirements? I use the example from when I was speaking with the electrical sector once: here in Ottawa, we were incredibly appreciative of Hydro Ottawa when we had ice storms and we saw lines toppled over and crews out there working hard—or the tornado is a better example. By day three, people were outraged that the electricity wasn't working, even though the substation had been completely destroyed. And we certainly don't like paying for our electricity; we balk at every penny we spend. So, how dependent are we on that physical infrastructure? And how do we start to show the value that investment in it brings? Because it is something that is hidden. We just expect it to work. And with cybersecurity, I fear we're falling into that exact same trap. We just expect it all to work.

[Scott pauses, then laughs.]

That perhaps was a little too soapboxy.

[Maithili and Rafal laugh.]

Rafal Rohozinski: Well, maybe I can start by paying you a compliment, and that's to say thank God that we actually have a Canadian Centre for Cyber Security. Because up until a few years ago, it was as if we had an airline industry without an air traffic control system. I think the fact that we've actually created a centre of excellence is hugely important. I say that partially in jest, but there's also a more serious point here. Product liability, consumer protection, the obligation of vendors to put privacy first and security first as an architecture for development—that has not been there. This industry developed largely in the wild. Imagine building a car where instead of a seatbelt and an airbag, you had your martini shaker and a roller for your doobies. In some respects, that's what you get with a lot of products from the IT sector. Not because engineers are bad, but because they wanted to make it work, because they wanted to make it compelling for consumers to use. That has to change.

I think using consumer protection laws as a means of compelling the values that we want technology to have is a good starting point, because that's one way of using the legislative process to create both incentives and consequences. At the same time, creating standard-setting bodies that can advise industry, provide recommendations and act as a single point of contact—so that people know why they should care about security—is really important. And that, Scott, comes back to the Canadian Centre for Cyber Security.

Maithili Mavinkurve: Yeah, one thing that struck me in this conversation is that we've obviously talked a lot about AI, but I realized we didn't touch enough on the hardware side. I'm not a hardware expert, but all of this is running on, as you said, physical infrastructure. And I think it's interesting that when we are setting some of these standards, we may be losing sight a little bit of the hardware aspect. I'll touch on a little bit of the work that I'm doing with the Standards Council. We're looking at the actual commercialization of AI solutions, when the rubber hits the road. What does that value chain look like? What are the risks and monitoring requirements across that entire value chain of data? But equally important is the physical infrastructure that is supporting all of that—whether it's making sure that I can get Wi-Fi wherever I go when I go up north, or, when I'm using different services, knowing where that data is going. Do I have the proper rules and regulations?

I liked what Rafal said around consumer protection. That's such a critical element of it. But it's also in the hardware quality—the classic ISO standards around quality management. These things have to be upgraded a little bit, tweaked a little bit, in this new era. So there's a lot of work that needs to be done across the board. And engineering probably needs a bit of an upheaval as well. I remember I did an ethics course in engineering way, way back, but these things have to come into play in all disciplines, I think.

Scott Jones: Thanks for that. I was thinking about when I did my engineering ethics course as well, but I also did a computer science degree where there wasn't a course on ethics.

Maithili Mavinkurve: Exactly.

Scott Jones: There wasn't a course on law and engineering, or law and computer science. It was programming and algorithms and techniques and things like that. So it's an interesting model. I think one of the other aspects here is that, Rafal, you mentioned earlier that a lot of our technology, at least the hardware, comes from outside. Nortel, unfortunately—and I say this as a telecommunications engineer—is no longer with us. We see its remnants around Ottawa, certainly, but a lot of our technology comes from around the world, especially when we're talking about telecommunications and 5G networks. No matter which suppliers are chosen and what decisions are made, the one thing we know is that it is being physically installed here but built somewhere else. Some of the software is certainly done here; there's a lot of expertise in Canada on this front. So how does Canada make sure that our data is secure and kept in a secure way when a lot of our infrastructure, hardware and software comes from other jurisdictions and could be assembled and built anywhere in the world?

Rafal Rohozinski: Well, I think it's important to recognize that the technologies we're talking about have become the commanding heights of the future economy. As a result, this is strategic territory where we will have both cooperation and competition. These technologies will be, and are being, seen by great powers—whether technological great powers or great powers in the military sense—as areas where they want to dominate. And they are going to be areas where those powers will employ state power to ensure that they can project their will into other countries. I mean, this is the reason why CSE exists, the NSA exists, and analogous structures exist around the world. I think the only way that we can strike that balance, Scott, is by recognizing that there is a legitimate national security concern in terms of how technologies are adopted and used. We can't simply close our eyes and say, "Well, you know, that's a deep state perspective. Industry will solve it all." We can't just be naive about this.

The way that we maybe get around the problem—that we are 37 million people in a country stretched across a huge part of the world's surface, probably more vulnerable to cyber disturbances than most—is by working in alliances. Over the last 45 years, those alliances have been principally the Five Eyes, more broadly NATO, and more broadly than that, the G20 and the trade relationships and relationships of confidence that we've created. It's also been our investment in the international rules-based system that has allowed us to have confidence in it. That's going to be a lot more difficult now, and I think we need to recognize that. But at the same time, that doesn't mean we should abandon it. I think we need to be looking at sovereign national capabilities—recognizing that this does have a national security dimension—while working with alliances and, more broadly, creating confidence at the global level, so that we address this at multiple levels. Thanks.

Maithili Mavinkurve: I don't know if there's a whole lot more I can add to that. We have to be cautious. We absolutely do. And I think most people within industry will say the same thing with respect to critical infrastructure such as 5G. In my mind—not to oversimplify it—it seems like we need to stick with our allies. There are a lot of options out there; there's not just one or two, or maybe there is. But we need to stick with our allies and be hyper-vigilant when it comes to this type of technology. Everything is going online. Our entire mode of communication is going to be digital. And we have to have our eyes wide open with respect to physical devices and infrastructure that are connected at all times. We have to be cautious. And we're seeing this across the world in terms of how 5G deployments are going. I don't think we should be hasty. It's okay to take a pause and make sure we're making the right decisions before we proceed.

Scott Jones: Great. One of the challenges we face—and I mentioned the martini shaker before—I used an example, I think, when I was giving some remarks before a panel you were on, Rafal: the cybersecurity equivalent is that with one click you can destroy your life or impact your company in a material way. It might be a company-ending or business-ending event. The equivalent in a car would be a button that, if you push the wrong one, your tires fly off and your engine bursts into flames. So we've talked about industry standards—and I'll get away from the hyperbole and the metaphor here—but that is something we have started to see. We've started to see some business leaders, and some leaders in industry, particularly in mobile communication, talk about privacy more and place more emphasis on giving consumers choice: you can opt to be in the free ecosystem, where a lot of data is collected, and now there's talk of a paid ecosystem where you get better control but you have to pay for it. I'm wondering, how do you see that developing over the next few years in terms of data protection? And is that possibly a way to gain some of that consent that, Mai, you talked about earlier, in terms of user consent and user acceptance of the use of their data?

Maithili Mavinkurve: Yeah, I think—just super quick to comment—businesses and enterprises in technology are going to see disruption in business models. Absolutely. I think it's just a matter of time. I could be wrong, but I do think that there's a lot more pressure from consumers around how that data is going to be used. Currently we think, "Oh, well, it's a great convenience. We're getting all these free services." There's still obviously a large education effort that has to happen. But it's going to be a combination of consumer education, literacy and knowledge, consumers fighting back around their own rights, as well as regulations around the guardrails that protect the consumer. The combination of those, at some point, is going to flip the switch with respect to how we make money. I mean, as businesses we obviously still need to continue to make money, but we're going to have to get a lot more creative around how that happens. And I don't think we're there yet. Right now, it's the freemium versus the paid model, and that seems to be the norm, so to speak. But I think we will start to see a shift that involves the user more, in some sort of informed way.

Rafal Rohozinski: Yeah, I'll give you a contradictory answer, in the sense that I'm going to contradict myself. As I said before, I think the simple reality of the mass deployment of 5G technologies is going to mean that we'll have a whole different data economy than we have right now, one which is going to be infinitely more complex than simply: do I give consent or not to use a particular platform? Do I agree to pay? But I think there's also a fundamental problem here that we need to be aware of—it's just over the horizon. And that is that while it might be beneficial to take a privacy-first approach and start saying that maybe these natural monopolies that have emerged, whether it's the Googles or the Facebooks, have to be broken down, at least what we've been able to benefit from is a globally accessible commons. We may have digital divides in terms of people who have access to technology or not, who have skills or not, but Facebook is free. So is Google.

What happens when we start creating closed ecosystems, where all of a sudden socioeconomic differentiation means that I have different access than you do, and my ability to benefit is different? That's coming. There's a good reason why we had the Enclosure Acts centuries ago, which created wealth for landholders in the United Kingdom. And I'm sure that in the richness of the digital territory, we are going to start seeing fiefdoms carved out in different ways. But as a society—going back to ethics and our societal values—if we want to preserve this as a commons for Canadians, then we need to be conscious that our data universe is not going to be a singular model, whether privacy-first or open commons for everyone. It's going to have to take these things into account to prevent new digital divides from coming into being.

Scott Jones: I think with that, we're getting very close to our time. I would just like to thank you both. This has been a fascinating discussion. I know we live and breathe these concepts every single day, and I'm hoping those of you in the audience have found it very useful. I certainly thank you for your great questions; they helped shape this discussion, and I tried to pull in as many of them as we possibly could. Mai and Rafal, this has been a terrific discussion. You are clearly leading a lot of Canada's thought on these topics and pushing it forward, and I really thank you for your time, for sharing your thoughts, and for your openness with this group of public servants. I take the point that you both made about needing to work together, and to think through for Canada how we break down some of those barriers that belong in the past—whether that be our departmental silos and cylinders of excellence, or the barriers between the public and private sectors and academia as well.

So, thank you both for taking time out of what I know are busy, busy days. I appreciate that. We were joking at the beginning that none of us have had our dogs, cats or kids jump into this video, so we did well this time.

[Maithili and Rafal nod, smiling.]

For all of you, the next event in the New Economy series will be held on December 1st, and it will focus on the need for standards and governance in a thriving modern economy. I think you will have felt that's been teased out quite a bit today, so there's a lot of discussion to come in that next session. I'd like to thank all of you for participating. I wish you all a great day, and again, thanks to our terrific panellists and to Aaron for pulling this together. Bye now.

[Maithili waves and the Zoom call fades out. White text on a purple background reads, "Thank you for learning with us today. Follow us on Twitter @School_GC." The screen rolls like a film reel, then text reads, "Merci d'avoir appris avec nous aujourd'hui. Suivez-nous sur Twitter @Ecole_GC." The screen rolls again and the animated white Canada School of Public Service logo appears. Its pages turn, closing it like a book. A maple leaf appears in the middle of the book that also resembles a flag with curvy lines beneath. The Government of Canada wordmark appears: the word "Canada" with a small Canadian flag waving over the final "a." The screen fades to black.]
