Opinion: The UK Government response to the DCMS report

At the end of this summer, the UK House of Commons DCMS Select Committee on Fake News published an interim report. It was a wide-ranging piece of work written in the midst of an equally broad investigation that produced headlines about Facebook, Cambridge Analytica, Alexander Nix, Arron Banks, and Russia. The report covered the definition of disinformation, the role of tech companies, the Facebook and Cambridge Analytica allegations, political campaigning, Russian influence on elections, and digital literacy.

MP Damian Collins’ Committee made 53 recommendations to the Digital, Culture, Media and Sport Department. The Government, as it is obliged to do, responded to the recommendations with a report, however brief, published last month.

Almost immediately, Committee members expressed their disappointment. Just three of the 53 Committee recommendations were accepted. Four were rejected outright. Nine recommendations were ignored. And in testimony to the Committee, the newly appointed DCMS Secretary, Jeremy Wright, made the case that most of the Committee’s recommendations ask that matters be considered, and so the Government promised they would, indeed, eventually, consider them.

Funny as that sounds, the Committee’s report is no laughing matter. It warns of a “crisis in our democracy” created by disinformation campaigns and hate messaging, and expresses deep fears over Russian interference in UK and other Western elections.

I’m the Specialist Advisor to the Committee, and I, too, was disappointed by the Government’s response, especially given the timing. Just this week, Leave-financier Arron Banks has been referred to the National Crime Agency by the Electoral Commission - something the Committee recommended, yet the Government refused to respond to. The Daily Mail is reporting that Theresa May, as Home Secretary, stepped in to stop an investigation of Banks during the Brexit Referendum campaign. Ofcom, in testimony this week, continues to refuse to pull the UK licence of Russia Today — aka RT — a Kremlin-financed propaganda broadcast network. And just yesterday, the founder and editor of Far Right Watch claimed to have confirmation that Nigel Farage is being actively investigated by the FBI. What a week.

There are several points in the Government’s response I find particularly disappointing, and they are worth exploring.

A few of these can be summed up succinctly. The Government is dragging its feet while Democracy burns, deflecting calls for criminal investigations, claiming they’ve found no “successful use of disinformation”, and kicking the can down the road until after Brexit. Most of the Government’s response has simply flagged a number of ongoing consultations and reviews — the “Protecting the Debate: Intimidation, Influence and Information” consultation, for example — and promised to consider them at a future date.

These issues demand a greater sense of urgency from all parties. Given the broad assaults on democracy across Western Europe and North America, by both foreign and domestic actors, I believe the damage being done will take decades to repair. How many tainted election cycles will pass before the Government mounts an effective strategy to counter assaults on our Western pluralistic consensus?

Further, the Government’s response liberally quotes a line that has been trotted out over the past six months — they’ve observed no “evidence of the successful use of disinformation by foreign actors, including Russia, to influence UK democratic processes”. The success of these operations is beside the point. Any attempt by foreign and malign actors to influence UK democratic processes is hostile by nature, and its success or failure does not change this.

If you try to kill your neighbour, but you weren’t quite up to the task, it’s still attempted murder.

Finally, in the face of the Government’s pale response to the Interim Report, it’s worth noting a recent motion for a resolution passed by an EU body — the EU Committee on Civil Liberties, Justice and Home Affairs. They’ve produced a bold document, calling for criminal investigations, sweeping data legislation reforms, and perhaps most important, a full data audit of Facebook and the big social media networks. I agree with these calls; if you’re planning to regulate the social networks — and believe me, regulation is coming — it’s good to start with understanding the ground upon which you stand.

The problems facing democracy are acute. And it is clear there are two sides in this argument. On one stands democracy and the civil society; on the other, the fracturing of our democratic culture through malign interference from Russia. Solving this will require a team effort — from Europe and America, from the left and the right, and from the Government and Parliament. All politics aside, it is critical for each one of us to recognise the current crisis as precisely that — a crisis of Democracy — and to make it unambiguously clear which team we’re actually on.

Dr Charles Kriel is Corsham Institute’s Data, Ethics and Trust Research Fellow, and Specialist Advisor to the DCMS Select Committee on Fake News and Disinformation.

Scoping a community approach to young people’s digital skills

When it comes to young people’s digital skills and online safety there’s such a wealth of organisations and resources available that it can be difficult to know where to start. We know we want to work together with our local community to enhance the digital literacy, safety and creativity of young people – but how do we do this effectively?

On Monday October 22nd we hosted a community focus group to share ideas for our new Children and Young People programme. We had parents, educators and community professionals around the table to discuss opportunities and ideas from all angles.

We began the session by setting the scene, sharing some of the findings from our research and outlining our proposed themes and strands: education, safety, and creativity…

  • We believe all young people should receive an education in computing and so we want to support local schools and teachers in delivering the computing curriculum using resources from Barefoot and Computing at School. 

  • We want young people to understand the safest ways to interact in the digital environment so we want to link with schools and community groups to share online safety information from expert organisations such as ParentZone and South West Grid for Learning. 

  • Our research to date shows that giving young people the opportunity to use technology in a creative way will engage and enthuse them so we want to create extra-curricular opportunities, for example a Code Club, so young people can experiment with technology in new ways. 

After presenting our findings and ideas we asked attendees to talk in groups and prioritise which initiatives we should run for teachers and pupils in local schools, parents and carers in the home, and everyone in the local community. It was agreed that we should work in partnership with established local groups to deliver information and activities.

We discussed a wide variety of possibilities, including messaging through established brands and campaigns such as Microsoft and Safer Internet Day. We discussed the power of social media influencers and the potential to connect with those such as Zoella to act as role models for young people’s online behaviour. We shared ideas for projects such as the development of a young people’s digital festival in partnership with other community organisations such as Corsham Youth Zone, Pound Arts and Springfield Campus. This could be held to coincide with Safer Internet Day 2019 and include workshops for game-making, code-breaking, and animation.

Such an approach would see Corsham Institute supporting community hubs to host events which allow young people to interact with technology in a practical and creative way to highlight the positives of the digital world.

The research we’ve carried out so far shows that young people are keen to have opportunities to interact in a more creative way with technology, so it’s exciting to be able to offer these experiences. We will work with the community to develop projects we can roll out in partnership, ensuring they are user-designed so that they work for the young people in the area and give them the chance to engage in new and inspiring ways with digital technology. We aim to empower them, and those around them, to use digital technology in positive and creative ways. If you would like to get involved or support this programme, please contact us via info@corshaminstitute.org.

Corsham Institute at the United Nations General Assembly

By Charles Kriel, Research Fellow – Data, Ethics and Trust

Last week the United Nations wrapped up two weeks of high-level general debate, talks and panels at its 73rd General Assembly. For two weeks, government, private industry and civil society leaders, along with experts, activists and celebrities, discussed the greatest challenges of our times — equality, poverty, climate change and peace, among a range of other topics.

World leaders guffawed at Donald Trump’s speech. The organisation UN Women presented their 2030 gender equality agenda. And heads of state addressed (or clearly didn’t) this year’s theme of “Making the United Nations Relevant to All People: Global Leadership and Shared Responsibilities for Peaceful, Equitable and Sustainable Societies”.

I was fortunate to have been invited to represent Corsham Institute at the “New Partnerships for Countering Violent Extremist Narratives” event, hosted by the Ministry of Foreign Affairs of Denmark. Across what was in fact a full-blown mini-conference, Countering Violent Extremism leaders from around the globe spoke of their work, their hopes, and their programmes.

Countering Violent Extremism, or CVE, is a form of counter-terrorism placing an emphasis on crafting positive narratives about the civil society. At its best, it is a community-driven social media practice, reaching out to vulnerable individuals whose need for social inclusion is so great, they might be pressured into acts of violence against their fellow citizens. CVE is media heavy and military light, and often best practiced by charities and NGOs working in concert with government organisations.

Everything from online disinformation (or fake news) through to digital inclusivity has a role to play in countering extremist narratives. Just as it takes an entire village to raise a child, it also takes an entire community to prevent the online radicalisation of that child grown into an adult.

A few highlights:

  • The Danish Minister of Foreign Affairs, Anders Samuelsen, opened the sessions by reminding the audience that terrorist attacks have decreased for the third year in a row. But he also noted that although Da’esh is losing the fight in Syria and Iraq, they are thriving in “cyberspace”.

  • Lebanese National PVE Coordinator Rubina Abu Zeinab pointed out that every action must be taken to increase trust by the community.

  • While pointing out that the internal culture of big platforms regarding terrorist content had changed in the last decade, William McCants, Public Policy Manager for Google, claimed that offline communication was far more important to radicalisation than online — that online “only makes a contribution”.

  • Farah Souhail of London’s ZINC Network felt that dehumanisation was the leading issue, and that her StratCom agency emphasised the humanisation of the terrorist victim. She also noted that in Tunisia, where children have little access to traditional media but easy access to online content, their main entertainment is often waiting for the next Da’esh video and fantasising about its content with their friends.

  • Erin Saltman of Facebook again emphasised the importance of offline contact. And like the Google representative before her, she shifted much of the responsibility for the online distribution of radical material onto the shoulders of smaller platforms with fewer staff, pointing out that Facebook employed ten thousand content reviewers.

  • Alexander Guittard, Director of Governance and StratCom at M&C Saatchi, made a strong point of emphasising the fluidity between online and offline media. “There is no distinction,” he claimed, pointing out that users easily move between worlds. And,

  • Alexander Ritzmann of the EU’s Radicalisation Awareness Network noted that the private sector could be used to help the civil society, but it must be directed. He also made a call for more programmes empowering citizens and communities to speak up and express their own civil society narratives.

The depth and breadth of knowledge from the speakers was impressive. But in these sessions and others, I also found a clear divide between civil society organisations, and those charged with representing the major platforms — Facebook, Google, etc.

I found their claims around online and offline particularly disingenuous. While almost everyone agrees that offline contact is important, that is a step that often doesn’t occur until very late in the radicalisation process, when a groomer seeks to move their target to violent action. Meanwhile, these vulnerable individuals — potential terrorists — exist within an online community of individuals with radicalised views, fed daily by online content.

Further, transferring responsibility for online radicalisation to small platforms ignores many of the facts of online extremist materials. Smaller platforms like the oft-cited JustPaste.it do indeed find themselves hosting extreme content - violent and dehumanising. But radicalisation benefits from more seductive work, promising lands of plenty, brotherhood, and a place where a vulnerable young man might belong.

And even one hundred thousand content reviewers would be a drop in the bucket for YouTube or Facebook, particularly when they’re also responsible for policing porn and spam.

Finally, Alexander Ritzmann’s points cannot be emphasised enough. Counter-terror narratives need to be delivered by members of the community, telling their own stories. They should be told by people that we trust, and shouldn’t just appear to be genuine, but be genuinely authentic.

It was an enormous honour to be invited to this year’s UN General Assembly. And while our “Countering Violent Extremism” podcast may now have expanded to the larger issues of “Data, Ethics and Trust”, our emphasis is on exploring how technology impacts communities. Understanding the workings of online media, technically, socially and psychologically, remains vital in our quest as a civil society organisation to promote the narratives of peace and democracy. Online or off.

“Hey Alexa, stop sharing my data!”

By Martin Head, Programme Director, Communities and Director of Content

Our homes are becoming ever more connected, with ownership of smart devices more than doubling in the last two years. This data, from a report earlier this year from PwC(1), reveals that almost 40% of people now enter the connected home market via smart entertainment devices.

The PwC survey estimates that the market for smart devices will reach 10.8bn in 2019, whilst techUK(2) has recently found that the use of smart speakers doubled from 2017 to 2018 and that the number of households owning more than three smart home products has grown by a quarter since 2017.

As Ronan O’Regan, PwC’s Digital Utilities lead, said when their report was launched:

“While smart home assistants are relatively new to the market, we believe they could potentially be the ‘glue’ towards wider adoption. You could say they are having an ‘iPhone effect’ in the market.”(3)

The development of connected technology is still in its infancy with some devices offering solutions seemingly still in search of a problem (Bluetooth kettles anyone?). The real scope and ultimate power of connecting our homes in an integrated way is still a long way from being realised.

There are huge opportunities in how we might use the technology to support and protect people; however, these devices generate vast quantities of personal data – a fact that may be misunderstood by users. A deeper understanding of data privacy, and the ability to build a trusted relationship with the providers of the technology and the uses to which it is put, is therefore needed. As techUK’s Sue Daley wrote recently in a piece for Corsham Institute’s Observatory for a Connected Society app in regard to AI and ethics, “It is our job to continue to build the culture of data trust and confidence needed to ensure technology remains a force for good”(4).

The relevance and power of all the personal data gathered from a connected home, when numerous devices are integrated together, is going to take on new dimensions at an exponential rate. PwC, in launching their recent report, said that “Tech giants are blurring lines and breaking down barriers, creating innovative products that capture data to provide differentiating insights, novel solutions and a seamless user experience”, but they recognised that trust in suppliers “could become a major battleground for traditional players over the next few years”(3).

For most consumers, the journey to understand the meaning and full implications of sharing their personal data is only just beginning. Corsham Institute’s Your Data, Your Rights survey earlier this year showed that while 60% of respondents cared a lot about the use of their personal data, only 18% knew a lot about its collection(5), and the recent techUK ‘Connected Homes’ report shows that, for 23% of consumers, personal privacy was the second-highest barrier to buying connected home products (after the cost).(2)

Further, while there is an official definition of personal data in the 2018 Data Protection Act as “any information relating to an identified or identifiable living individual… in particular by reference to (a) an identifier such as a name, an identification number, location data or an online identifier, or (b) one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of the individual”(6), what we’re prepared to share isn’t a fixed concept; it will differ between age groups and other demographics, and with the value we perceive we are receiving from each device or app.

In a recent workshop held at the Digital Catapult by a multi-university, Petras-funded project into the ‘Value of Personal Data in the Internet of Things’(7), the ‘privacy paradox’ was highlighted as the tension between individuals’ stated desire to maintain privacy and their willingness to share their data online, while it was recognised that ‘little is understood about the value consumers place on keeping their data private’. The project’s work to date includes findings that individuals perceive a lack of choice about whether to share their personal data, care deeply about protecting it, and are even willing to pay to do so.

The debate over data ethics is increasingly fundamental, and ever more urgent as homes become more connected and devices could be listening to our everyday conversations, making decisions about our preferences, opinions and shopping habits(8). It’s not simply about finding more appropriate business models to use the data, nor about making people pay for the protection of the data about themselves; it’s that a culture of data urgently needs to develop as rapidly as the technology is doing, and to grow hand in hand with it.

Individual citizens need to own and control third-party access to their personal data and, further still, when access has been granted, have ongoing choices about how it is used, with the ability to withdraw that permission if circumstances change. This is an area where Corsham Institute will be working with partners to develop ethical frameworks and community-led test beds, to understand in greater depth the implications for all of us as our homes become ever more connected.

If Alexa and Cortana are to become invited guests into our living rooms or even virtual members of the family, they must end up serving the real interests of their owners and work for us, and not use their all too attractive functionality as a cover story for a massive data mining exercise by those who market them.

—————————————————

(1) PwC, Disrupting Utilities: www.pwc.co.uk/industries/power-utilities/insights/energy2020/connected-home.html

(2) techUK Report: State of the Connected Home, Edition Two, September 2018 www.techuk.org/insights/news/item/13914-connected-home-device-ownership-up-but-consumers-remain-sceptical

(3) www.pwc.co.uk/press-room/press-releases/UKs-appetite-for-connected-homes-grows.html

(4) Excerpt from blog by Sue Daley, Head of Programme, Cloud, Data, Analytics and AI at techUK: ‘AI, ethics and data: a watershed moment for UK tech’, written for the Observatory for a Connected Society: www.connectedobservatory.net/commentblog/sue-daley

(5) www.corshaminstitute.org/ydyr-survey-results

(6) Data Protection Act 2018, Chapter 12, Part 1:3(2) www.legislation.gov.uk/ukpga/2018/12/enacted

(7) www.petrashub.org/portfolio-item/value-for-personal-data-in-iot-vpd/

(8) www.theweek.co.uk/93869/is-amazon-alexa-listening-to-me-all-the-time

New skills and fresh perspectives

Looking ahead to our 2018/2019 Apprenticeship Programme

It is widely recognised that the education system in the UK is struggling to keep pace with our changing digital world. Young people are seeking, but are not always able to find, learning opportunities that develop their technical skills alongside ‘human’ skills like creativity and problem-solving. 

Corsham Institute recently supported the Techie Awards at Hartham Park and heard from employers across the South West region, who said they had been struggling to find new talent. This skills gap, for young people who can’t find appropriate learning pathways and employers who can’t find the right skills, could potentially be plugged with the help of apprenticeships.

Research by The Sutton Trust this year found that 2 in 3 young people would be interested in doing an apprenticeship. Despite this, just 41% say their teachers have discussed apprenticeships with them at school. 24% of secondary school teachers think there are enough apprenticeship opportunities at A-level. However, just 1 in 5 of them is willing to recommend these opportunities to their highest-attaining pupils, and 64% of teachers would rarely or never advise an apprenticeship.

With the high fees of university education, young people are considering alternative routes to avoid the costs and debts. The ‘earn whilst you learn’ concept is a big incentive for most people pursuing an apprenticeship, and the government’s 2020 target of 3 million apprenticeships in England aims to give young people the alternative they’ve been searching for.

Here at Corsham Institute, we take great pride in our Apprentices. Our Apprenticeship Programme has been running since September 2015 and so far we’ve welcomed four young people through our door. They’ve all been a key asset to our team and have provided us with new skills, fresh perspectives and an opportunity to grow local talent.  

In September 2017, we were pleased to take on Sam and Kara as Creative & Digital Media Apprentices. For the last year, they’ve been a core part of our Creative Team and have produced many high-quality pieces of work around digital marketing and creative content. 

Kara, who has now completed her apprenticeship, said “I have really enjoyed my time as an apprentice at Corsham Institute. I was given the opportunity to work on a variety of projects including filming and editing a TEDx talk, photographing and producing motion graphics for an event at the House of Lords, designing infographics for Safer Internet Day and producing fundraising videos. Another valuable opportunity was the time given to us to pursue an independent project, which allowed me to build the coding and design skills I will need in my next job. My time as an apprentice has given me the chance to build technical skills as well as softer office skills. All of this will continue to benefit me in the jobs I have in the future.”

Sam, who has also now completed his qualification, said “My time at Corsham Institute has been completely rewarding, and very enjoyable. There have been so many opportunities, which I haven’t taken for granted. Some of my favourite tasks include shooting a video in Cambridge at RAND Europe’s main office, helping to run social media at the launch of an app at the House of Lords, and shooting and editing a video in Bristol at We the Curious. An apprenticeship, in my opinion, is a fantastic step in the right direction, and has given me the correct skills and workplace etiquette to aid my professional progression. With an apprenticeship, you really do get out what you put in.”

Both Sam and Kara have now taken the next steps on their professional journeys. We are very proud of their achievements with us and we wish them all the best for the future. Having passionate young people in our team ensures we have a diverse organisation, with a range of perspectives. We look forward to welcoming two new apprentices in October as they begin their journeys. 

Our Apprentices' independent projects

This year, we asked our Apprentices, Sam and Kara, to produce a piece of creative content aligned to the field of work they were most interested in at Corsham Institute. The purpose of this project was to give our apprentices the chance to develop skills that would help them move into their desired career paths and that they wouldn’t necessarily get to develop as part of their everyday jobs. It was also a chance for them to develop their communication, project management and time management skills. The team around them provided support and guidance, but it was down to them to be the driving force behind their chosen projects.

Kara took an interest in our Virtual and Augmented Reality Environment and wanted to explore how the technology could be used as an interactive tool for education. Sam was interested in our Children & Young People programme and wanted to explore different online cultures and their effects on young people. Below they write about their experiences and what they learned from them.

 

Kara’s Independent Project

After seeing the Virtual and Augmented Reality Environment at Corsham Institute, I was interested in how this technology could be used not just as a means of entertainment but also for education. I wanted to create something that could be used to test players’ knowledge in an engaging way. Creating this would also allow me to learn the fundamental skills behind creating game content, such as using game engines and basic coding.

I began by researching educational games, looking into methods of learning and the mechanics of game design. I also researched industry-standard game engines and coding languages. Through this I decided to learn the game design software Unity and to code using C#. From here I completed several Unity tutorials to get to grips with the Unity game engine and C# coding. This allowed me to build up the basic knowledge and terminology I would need to complete my project. I began to design a simple question-and-answer game so that I could ensure I learnt the fundamentals of game creation. As I began to create the game, I realised that writing the code to allow interaction was going to be a lot more challenging than I initially thought. I spent a lot of time reading through forums and troubleshooting areas of code, as this was a whole new language to learn and get to grips with. In the end, this proved very effective, as I began to recognise the errors I was making and work out how to fix my mistakes.

I managed to create two basic game levels, one with colour matching and one revolving around answering a question. Through this project I’ve learnt a great deal about game design, mechanics and the work that goes on behind the scenes of a game. These skills are all going to be very helpful for me in my new job as a Junior Interactive Media Technician, where I will be working with code, game design and virtual and augmented reality. On a personal level, I enjoyed the challenge of learning the basics of game creation and seeing these things replicated in much more complex ways in the games I play myself.
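To give a flavour of the kind of interaction Kara describes, the sketch below shows roughly how a simple question-and-answer check might be wired up in Unity using C#. It is not Kara’s actual code: the class name, field names and example question are all illustrative, and it assumes a Unity scene with UI Text elements and answer Buttons connected in the Inspector.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch only: a minimal question-and-answer check of the kind
// described above. Names and values are hypothetical, not taken from the project.
public class QuizQuestion : MonoBehaviour
{
    public Text questionText;   // UI Text element that displays the question
    public Text feedbackText;   // UI Text element that displays "Correct!" or "Try again"

    [SerializeField] private string question = "Which planet is closest to the Sun?";
    [SerializeField] private string correctAnswer = "Mercury";

    private void Start()
    {
        // Show the question and clear any previous feedback when the level loads.
        questionText.text = question;
        feedbackText.text = "";
    }

    // Wire each answer Button's OnClick event to this method in the Inspector,
    // passing the text shown on that button.
    public void SubmitAnswer(string answer)
    {
        feedbackText.text = (answer == correctAnswer) ? "Correct!" : "Try again";
    }
}
```

Hooking each button’s click event to a method like SubmitAnswer is the sort of interaction logic that, as Kara notes, tends to take the most troubleshooting when first learning a game engine.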

Below is a screen recording of Kara’s game, which she designed with the potential to be converted to VR.

 

 

Sam’s Independent Project

For my independent project, I produced a documentary called ‘The Influence of Influencers’, an insightful look into the world of YouTubers and their content. I wanted to do a documentary on the subject of YouTubers because, as a consumer of YouTube content, it’s something that really interests me. I also thought it relates well to the Corsham Institute Children and Young People programme, which aims to understand the risks and benefits of digital technology for young people and support them in managing those risks and embracing the benefits. YouTube is currently one of the most popular video platforms used by young people and I see my documentary as something that will, hopefully, provide a bit of insight to both parents and educators about YouTube as a platform and some of the positive and negative aspects of it. 

To produce this documentary, I first wrote a script, which included some context and recent news about the platform and allowed me to visualise what route the documentary was going to take. I then arranged and recorded interviews with two YouTubers, Elena and Jake, each of whom has a very different style. After that, I gathered clips from a wide variety of sources to support the narrative of the project, and recorded the narration. The editing side of the documentary was the most time-consuming, but also the most rewarding.

Prior to commencing the documentary, I fully immersed myself in the subject and researched it extensively, which allowed me to be confident when interviewing Elena and Jake and, when writing the script, helped me to choose which key points to include. I have learnt a variety of different skills from doing the independent project. I have developed my video editing skills further, having had no previous experience or knowledge of how documentaries are made. I have also learnt how to record voice-over for video, a skill I was keen to gain and find very interesting, and the project has helped me improve my interviewing technique, an attribute I am very lucky to have developed.

Overall, this project has been very useful, and I have found the documentary creation process really interesting. I have enjoyed producing ‘The Influence of Influencers’, and I hope those who watch it will learn something new and become aware of just how important online influencers are for young people.

Below is Sam’s short documentary, ‘The Influence of Influencers’.

Adoption of public cloud services in the NHS: trust, security and public opinion

The benefits of cloud computing for the NHS are significant but public awareness of its use is low. Those are the findings from the latest Ci report: “The Adoption of Public Cloud Services in the NHS: trust, security and public opinion”. Our findings are based on exclusive polling from ComRes, which tested levels of public understanding of data storage options within the NHS and confidence or otherwise in their security, and on interviews with a range of health and care professionals and experts. 

Our survey found:

  • High levels of trust that the NHS is storing patient data securely: 70% of British adults say they are confident that their NHS data is stored securely, while 25% say they are not confident. 
  • Low levels of understanding as to how patient data is currently stored in the NHS, with nearly half of respondents thinking that it is stored on a national NHS computer server and only 28% thinking that it is stored on a cloud.
  • A difference in people’s views between the prospect of their patient data being held on clouds managed by British companies who store data only within the UK (49% of people were comfortable with this) as opposed to being held on clouds managed by global companies (69% of people were uncomfortable with this).
  • A desire for more information on data storage in the NHS, with 88% of adults saying it was important to know where and how data is stored and 80% saying it is important to know if data is kept outside of the UK. 


Our report looks in detail at recent NHS data handling stories and the current policy and data governance landscape, including the impact of the Cambridge Analytica/Facebook scandal on public trust in data security more widely. We draw out a number of important themes from our research and interviews, including:

  • The importance of emphasising the benefits from the adoption of public cloud in the NHS, including: lower costs (freeing up more money for frontline care); greater safety and security of the data; and the opportunity for better care and innovation.
  • The need to address some significant challenges for the NHS, including: low levels of digital literacy and technical skills; barriers to maximising the potential of cloud computing, including financial impacts if there are long-term contractual tie-ins to big cloud providers; and the risks from the gulf between the low levels of public understanding of the use of cloud computing, particularly when provided by major global tech companies, and the potential impact should a data security breach occur that is linked to a cloud provider. 

Taking the polling and the research together, we conclude that there should be better engagement with the public to make them aware of the use and benefits of cloud computing in the NHS, and to build their understanding and trust in a way that pre-empts risk, rather than waiting to respond to a security breach or other data handling controversy. We also flag the considerations and trade-offs to be made between choosing a UK-based or global public cloud provider, particularly in relation to data protection and procurement.

Simplifying and demystifying personal data

As with much in the debate about data and tech, losing the jargon, demystifying the regulations and ultimately keeping things simple when it comes to personal data and people’s rights is always going to be a challenge.

But that was the clear call from a workshop that Corsham Institute hosted on 20 June 2018, focussed on our community engagement and life-long learning project, Your Data, Your Rights.

Local people from across the Corsham area came together to discuss the challenges, the knowledge gap, the pros and cons of data sharing, and where the balance should be between raising awareness of the threats and promoting the opportunities.

At Corsham Institute, we are following the 5D project design model of Discover, Define, Design, Deliver and Disseminate. The initial Discovery phase for this project was a wide-ranging online survey of local residents, carried out in April 2018. Overall, the survey showed that while most people lack essential knowledge about their personal data and how organisations collect and use it, they also care a lot about it, are interested in their data rights, and want more information.

With our community-focussed ‘Test and Learn’ approach we are actively engaging with groups of people who responded to the survey in the current Define stage of the project, to go deeper into the challenges, and begin to shape some specific outcomes to take further.

One of the strong emerging themes is that while digital challenges may be new and complex, the answers may well be much older and more traditional. Not relying on technical solutions or new apps, but relying on people and communities to respond by stepping forward to find ways to stay safe online.

Our workshop participants initiated discussions on a wide variety of related topics, ranging from the adoption of a public health approach to data and data rights, to the development of a community ‘digital spirit’ with an e-version of Neighbourhood Watch schemes, where people would look out for the safety of their neighbours.

Such an approach would require a basic ‘data-pack’ on online safety and rights, to inform the community across its demographics, knowledge levels and skills base, with consistent content and simple messaging to cut through the complexities. It could then be used by existing community networks and trusted intermediaries, and cascaded through teams of local digital champions to residents.

The possibility of co-designing such a solution is an exciting one, where, after some initial priming of the pump with the tools it needs, a community could take the lead and support itself to become a safer, connected and more cohesive place, where knowledge spreads, the skills needed across people’s lives grow, and the benefits of our digital world are increasingly realised in local ways by local people.

This vision is one Corsham Institute, working with members of our local community, will develop and test further to find models that can be scaled and replicated in other areas. When fully developed, this community-led approach could become an important contribution to research in the life-long learning arena, into the skills we all need for the digital age.

Corsham Institute specialises in research and learning

Corsham Institute was established to understand the challenges and opportunities provided by our digital world and connected society.

Our vision is to enable a thriving UK society where people have the confidence and ability to adapt and learn in a changing world. The mission of the Institute is to promote lifelong learning to help people to be resilient, confident and versatile. We want to empower people to think critically and creatively, solve problems, and to know who and what to trust. The Institute is committed to driving innovation and developing initiatives that scale and have a measurable and significant impact. 

In response to demand, the Institute will now specialise and focus its activities on research and learning.

As the breadth of our work is changing, Rachel Neaman, CEO, and Maeve Walsh, Director of Policy and Advocacy, will be leaving the organisation. We wish Rachel and Maeve every success in their future careers and thank them for their contribution over the last 12 months. 

Louisa Simons, COO, will now lead the organisation and the delivery of our programmes. More news will follow as our work continues.

Social media and screen time – what’s the real story?

With recent headlines claiming social media safeguards to be “inadequate” and conflicting views on what constitutes “healthy screen time” for young people, how do we separate fact from fiction in this increasingly politicised debate? 

Recent evidence from the Children’s Commissioner and our Safer Internet Day findings show a rise in the number of children with social media profiles below the “restricted” age of 13, exposing them to content and interactions inappropriate for their age. Social media companies must take more responsibility to ensure their users are protected and safe, and that age verification systems are doing their job. But this isn’t the whole story, nor is it the complete solution in the long term.

Ci responded to the Science and Technology Committee’s recent inquiry into the impact of social media and screen-use on young people’s mental health. We recommended that: 

  • Social media companies take greater efforts to verify the age of their users and gain appropriate consent for new and current users. 
  • Social media companies consistently record and report on the nature, volume and outcomes of complaints and reports made within their systems by children and young people. (1)
  • Industry-level initiatives be independently evaluated to understand how long-term reduction in harm and improvement in wellbeing can be achieved. Learning should be shared and applied to promote consensus amongst company policies and initiatives. (2)

It is imperative that Government helps social media companies collaborate and report issues, so there is a better understanding of how to prevent and tackle negative online experiences. We are therefore pleased to support the calls for an independent Internet Commission to take this forward.

However, we believe it is equally important that Government builds up a solid evidence base in this area before next steps are taken. The complexity and pace of technological change and the context of children’s digital lives require rigorous, nuanced research before we can draw informed conclusions to inform new policies or legislation. So, in our response, we also asked for more longitudinal research that takes the granularity of children’s lives into full consideration. The content that children and young people access, and the context in which they access it, will ultimately determine what has a negative impact on their mental health and wellbeing. Until this is known, the combative dialogue between Government and big tech firms will continue, whilst children’s, parents’, carers’ and educators’ voices are lost. Limits to screen time will not fix this.

Research shows that young girls are increasingly reporting higher levels of negative impact than boys - this must be interrogated more closely. We recommended that new research focus on gender as a control factor, so we can get to the root of this problem. It is vital that we understand how content is tailored differently towards boys and girls and what this means for children’s mental health and wellbeing in the long term. Much of this could be attributed to societal expectations on girls and women, as opposed to the social media platforms themselves; the platforms are often a conduit for issues that already exist. Parents and carers must be supported to have open conversations with children on self-esteem and confidence, to build their resilience and prevent issues from spiralling online.

We also advocated for programmes that help educators, parents and carers work with children to develop their critical thinking and problem-solving skills. As our world changes at an increasingly fast pace, it is essential that children develop empathy, resilience and creativity in their approach to life; what we term “life skills”. Children and young people will continue to rely on digital technology to build and maintain social relationships, develop professional profiles and participate in our globalised, connected world. We risk excluding children from the benefits that come with increased access to digital technology if we only focus on the negatives. 

We believe Government must look closely at our current education system and determine whether this is fit for purpose. With the never-ending pressure placed on children through rote learning and knowledge-based examinations, they will continue to be ill-prepared for our fast-paced and changeable world. 

We need updated curricula and guidance for educators so they can support children for the changing future of work, with healthy attitudes towards technology and an open, lifelong learning mindset. Children and young people need strong digital media literacy to better prepare them for our fast-paced online world. They will be the inventors of the world’s future technology. They should be equipped to decide what is right for them and supported to grow the behaviours and skills they need to thrive. 

We want to contribute to the much-needed national evidence base and find out how we can support children to have healthier, positive online experiences. We are developing a community-led project to tackle some of these issues in Corsham. Look out for further details in the coming months.

(1) https://youngminds.org.uk/media/2190/pcr144a_social_media_cyberbullying_inquiry_summary_report.pdf

(2) http://eprints.lse.ac.uk/84956/1/Literature%20Review%20Final%20October%202017.pdf, p.8, 2017

Personal data: how much do people know or care?

At Ci, we are focused on empowering people and building trust. Our recent blog post looked at public attitudes to trust, data and digital rights and set out the plans we had for our Your Data, Your Rights (YDYR) project in Corsham. We talk more about the results from our survey below and what we are planning to do next.

GDPR and the public: do they care?

With the implementation of the General Data Protection Regulation (GDPR) just days away on 25 May, there is still a big gap in engagement and communication with the public on what the GDPR means for people. The Information Commissioner’s Office (ICO) carries extensive material to help businesses comply with the regulation but, to date, there has been nothing focused on the public – although its new Your Data Matters campaign will be launched on the day of GDPR implementation.

Following the Facebook/Cambridge Analytica scandal, the lack of earlier public engagement around data protection, privacy and ownership now feels like a missed opportunity. We carried out our Your Data, Your Rights survey shortly after the news broke and asked our respondents some questions about its impact on their attitudes: 

  • 80% said that the events had made them think more about their data and what they share online
  • 40% said it had changed the way they feel about organisations having access to their data ‘a lot’, with a higher impact on the over-65s (60% answering ‘a lot’) than the 16-25s (only 10%). 

In response to our other survey questions, there was also a clear demand for more information on how people could use their new digital rights. 

In stage 2 of our project, we want to address this demand: working with the local community to develop their understanding of personal data, and how they control, share and protect it. You can read more about our project, our approach and the local survey results by clicking the button. 

Below we explore some of the more notable findings which either suggest differing levels of knowledge and awareness between different demographic groups, or where the local and national comparison is significant.

Also, to give our local survey some context, we have pulled together more of the recent national survey findings on people’s attitudes to data sharing in the slideshow below.

Understanding of personal data

We started our survey with some questions about people’s existing levels of knowledge on data and found that many people are unsure about the essentials. When given four definitions of ‘personal data’ to choose from, including the accepted ICO definition, 48% of respondents either selected the wrong response or admitted they didn’t know. Knowledge was poorest among the over-65s; and, in response to a question inviting respondents to identify all the types of data they would consider to be ‘personal’ (such as date of birth, mobile phone records, health data etc.), fewer of the younger generation (16-25 year olds) correctly classed their political opinions, genetic data and their name as personal data.

In terms of knowledge about how much data was collected about them, only 18% of our respondents overall said they knew a lot about the collection of their data, similar to the 20% of people in the Eurobarometer survey who felt they are ‘always informed about data collection and the way data are used’. Seventeen percent of our Corsham respondents said they knew nothing at all about what their data might be used for. 

But, for Corsham residents, the collection of their data was important to them: 60% of respondents said they care a lot about what organisations might use their data for (rising to a staggering 87% among over-65s), while only 3% said they didn’t care at all and 4% said they hadn’t thought about it before. Overall, however, these figures are much lower than the 94% of respondents to doteveryone’s digital attitudes survey who said it was important to know how their data is used.

 

Attitudes towards data use

Our survey showed that people had different attitudes to their data being used for different purposes. People report being most happy for organisations to use their data:

  • To comply with legal requirements (75%)
  • To provide the product/service they want (64%)
  • To help the NHS (62%, a figure far higher than a UK survey for the ODI would suggest, where 47% of respondents would share medical data about themselves if it helped develop new medicines and treatment); and
  • For local councils to provide better services (50%; which contrasts with the ODI’s survey that found only 26% were willing to share their data if it helped identify which new public services should be funded). 

Our respondents were least happy with organisations selling their data for profit (only 1% indicated they were happy with use of data for this purpose), to receive marketing/advertising (6%) and to share with partners for services (7%).

 

Attitudes towards control of data

In our survey, only 2% of respondents said they felt they had full control of their data but, when asked how much control they would want, 77% answered ‘full control’ (which is lower than the 90% answering the same to the Pega European survey in 2017, and the 91% who replied to the doteveryone survey to say it was important to be able to choose how much data they share with companies). 

Only 1% of our respondents wanted no control, which aligns with the Big Brother Watch Survey where 0.7% responded similarly. 

 

GDPR awareness

We ran the survey with just over a month to go before the implementation of GDPR. Sixty percent of our respondents said they knew about it and what rights it would give them. Although this suggested high levels of awareness of GDPR as an event (according to Kantar TNS' GDPR Awareness Index, only 34% of the general public were aware of GDPR in February), only 19% said they knew a lot about how to use the new rights it would give them.

One of the most interesting aspects of our survey was the responses to a series of questions on the new rights that individuals would be granted under GDPR, which we set out in our survey results update. Taking the right to erasure as a point of comparison with other surveys, national surveys have suggested that 62% welcome the right to erasure (SAS) and, at a European level, that up to 93% of people would erase their data if they weren’t comfortable with how they thought companies used it (Pega). In our Corsham survey, people said they would be less likely to ask for erasure than to rectify their data. Again, those over-65 were most likely to ask for their data to be erased: 87% were either likely or very likely to do so. 

 

What next?

So, we have lots to explore further with the community, and we will be interested in their feedback on our findings. In the coming months, Ci will work with local Corsham community groups, organisations and individuals to discuss the survey results and identify, then co-produce, the information they need to help them understand their rights, and how and when they can use them.

Ci will use the insight and evidence gathered from the community to feed into our Digital Trust project, where we are working with partners to influence a regional and national debate with policymakers, other influencers and organisations. We will report back on the next stage of that work here soon.

In focus: data, digital rights and our connected society

On 25 May, our data rights will change with the introduction of the General Data Protection Regulation (GDPR). Over the next fortnight, the Corsham Institute (Ci) and RAND Europe Observatory for a Connected Society – the only app and web platform dedicated to bringing together the latest research, insight and analysis on all aspects of digital and tech policy – will be focusing on the topic of data.

During this period, the Observatory will host:

The Observatory for a Connected Society will also be teaming up with techUK to support their Data Protection Week which kicks off on Monday 21 May.

As well as providing the latest insight on all things digital at your fingertips, the Observatory has recently launched its new community section where you can post comments, start discussions and build your own network. Download the app and get involved! We look forward to hearing your views.

Artificial Intelligence (AI) and its potential impact on the UK

Artificial Intelligence (AI) and its potential impact on the UK has been a hot topic this week, with the publication of the comprehensive report from the House of Lords Select Committee on AI. “AI in the UK: ready, willing and able” put forward many sensible recommendations, which Ci supports, on areas such as: skills, education and lifelong learning; data sharing and trust; and, fundamentally, the need for AI to be designed and developed for “the common good and benefit of humanity”. Greater public dialogue and engagement is urgently needed – not just on AI and its application in the future, but also on the pace of data-driven change in the here and now. We’ll be returning to that theme next month as we publish the results of our “Your Data, Your Rights” survey and set out the next steps on that work. 

So it’s incredibly timely that, after a successful 2017 event, techUK is running another “AI week”, bringing together news, insight and different perspectives on the opportunities AI can bring to the UK from a variety of leading experts, industry champions and thought leaders. We’ll be supporting this on the Corsham Institute and RAND Europe Observatory for a Connected Society which will feature three exclusive comment pieces next week from: Sue Daley, techUK’s Head of Cloud, Data and AI; Andrew Burgess, a leading expert and author on applying AI in business; and Rachel Neaman, Ci’s CEO. The Observatory already hosts lots of recent research and analysis on AI and related themes, which we’ll be pulling together into handy digests; plus it features all the highest-profile events, conferences and other activities on AI in the UK in the months to come. Do download the app to find out more; and we will also be sharing some of the best of techUK’s “AI week” content there next week too.

Your Data, Your Rights survey launched

Today (3 April), Ci has launched a new survey to benchmark individuals’ knowledge of, and attitudes to, their data rights.

With recent headlines dominated by the use of our personal data and the introduction of new rights under GDPR by the end of May, understanding people's attitudes regarding what data is collected, shared and used about them is both vital and timely.

The survey focuses on people who live, work or study in our Digital Corsham community and forms part of Ci’s Communities Programme. It is the initial stage of a longer term project, ’Your Data, Your Rights’, which will use the survey to identify the information people need to help them understand their rights and how and when they can use them. 

Ci’s CEO, Rachel Neaman, commented:

“With growing public concern around the security and privacy of personal information it’s more important than ever that everyone is fully aware of their data rights and is properly informed about how to act on them. 

The ‘Your Data, Your Rights’ project in Corsham will inform the wider debate and thinking about data rights and the use of personal data.”

If you live, work or study in the Corsham area, please take the survey. It only takes 10 minutes. 

Trust, data and digital rights

In just over two months, on 25 May, the General Data Protection Regulation (GDPR) comes into force in the UK. There’s no shortage of advice for businesses on preparing for it – from the detailed guidance and resources produced by the Information Commissioner’s Office through to a rapidly growing industry of GDPR seminars, conferences and blogs.

If we look at it from the perspective of “data subjects” – that’s all of us as individuals – the GDPR will enshrine a host of new rights that we can exercise in relation to the data that companies and organisations hold on us. The recent Digital Leaders blog by Catherine Knivett, Ci’s Head of Partnerships, sets these out in more detail. These new rights will undoubtedly change the relationship between individuals and any organisation that holds their data. And, under the headline-grabbing shadow of a fine of up to £17m, or 4% of global turnover, the focus in the run-up to 25 May is on how organisations can demonstrate compliance with GDPR.

Compliance suggests a reluctant, reactive, “if we must” burden. But these rights are vital for individuals. Those individuals are customers, consumers, clients, subscribers, service users, patients, members of professional or social communities – the people that organisations should care about. So this is an incredible opportunity: a chance for companies to demonstrate how much they value their relationship with the people whose data they hold, to make that data-exchange relationship more transparent and proactive, and to improve it for the long term, to the benefit of both sides. At our recent Digital Leaders South West salon, we explored some of these issues; you can read the reflections of one of our guest speakers, MyLifeDigital’s J Cromack, here.

So how much do we know about people’s attitudes to data: its protection and their rights to privacy and control? Do we know how this links to their level of trust in organisations that hold their data? And what do they want those companies to do to improve their understanding and trust? Thanks to a number of recent surveys and analyses of public attitudes to data and technology (for example, from the Open Data Institute (ODI) and Doteveryone) we are getting a fuller picture of what people think, and what they might want in return. For example, Doteveryone found that:

  • 95% of people say it’s important to know their data is secure
  • 94% say it’s important to know how their data is used
  • 91% say it’s important to be able to choose how much data they share with companies
  • 51% would like to know how their data is used but can’t find out

In the ODI’s survey, 94% of respondents said trust was important in deciding to share personal data. It also found that 33% of respondents would feel more comfortable sharing data if organisations explained how it is used and shared, and 18% would welcome step-by-step instructions from organisations about how to share data safely.

We have pulled together more of the recent survey findings on people’s attitudes to data sharing in the slideshow below.

But how much do we know about how people are going to react once GDPR comes into force? Well, not quite so much. A survey of 7,000 consumers across seven European countries in December 2017 asked people what they identified as the most important rights under GDPR:

  • 47% of the respondents identified the ability to simply see the information the companies hold on them
  • 22% identified the ability to demand they erase their personal data
  • 9% identified the visibility of when their personal data is used to make automated decisions.

However, a whopping 93% of European respondents said they would erase their data if they weren’t comfortable with how companies were using it. (An interesting footnote from this survey is that UK residents appear to be the least likely to act once GDPR comes in: 74% compared to 82% overall in the survey.) When it comes to it, will people act? If they want to erase their data, will they know how to? Will they understand what the implications are: what they might miss out on, as well as what they might gain in terms of greater control? That’s just one of the things we intend to investigate in our new Ci Communities project: “Your Data, Your Rights”. Read more about it here.

This project goes to the heart of Ci’s mission to empower people and build trust. If you want to get involved, or find out more, contact us at info@corshaminstitute.org and we’ll be reporting back regularly on progress on this blog.

Sam's story - National Apprenticeship Week 2018

Last week we celebrated National Apprenticeship Week with Digital Leaders. Our current apprentices, Sam and Kara, are completing their qualifications in Creative & Digital Media. You can find out more about their experience by reading Sam's blog below.

My name is Sam Bishop, and I am a 19-year-old Junior Content Producer here at Corsham Institute. I started my apprenticeship in September 2017, and have found it completely eye-opening and rewarding, right from the start.

I wanted to do an apprenticeship after finishing sixth form as I felt it was a beneficial way to learn and develop skills in areas I’m interested in, all while gaining a qualification and building a portfolio. I chose Corsham Institute because filming, editing, social media, podcasts, writing, and other forms of digital media really appeal to me, and the amount of work Ci does in these areas, and the extent of its professionalism, was clear. There are many things I enjoy about my role: meeting new people from a variety of backgrounds, going to new places and shooting videos on a real range of interesting topics.

Post-apprenticeship, I would love to do more film production work. The skills I have developed at Ci, together with my experience of film studies at A-level, have helped me discover my love for the production and post-production side of filming. Whether I look for a position, internship, apprenticeship or university course is unclear at this stage, but Ci has helped open my eyes to my true passion.

The range of skills, both creative and business-related, that I have developed is extensive. My time management has improved through working to strict deadlines, forward planning and arranging meetings. I have gained a greater knowledge of social media by helping to manage the Twitter, Facebook, Instagram and LinkedIn accounts. The change from sixth form to an apprenticeship is huge, and that was something I noticed quite early. At school you are only really looking after yourself: your punctuality, professionalism and approach to learning affect only you. In an apprenticeship, if you are half-hearted about any of these, not only do you feel it, but so do those around you. Corsham Institute is very much a team that pulls together to create some fantastic pieces of work, because everybody works hard. Teamwork and professionalism are by far the most valuable skills I have developed at Ci.

There have been many highlights so far in the apprenticeship, but these are by far my favourites:

• Filming, interviewing and editing a video with a children’s illustrator
• Travelling to Cambridge to shoot a video with RAND Europe’s Hans Pung
• Running social media at the House of Lords for the launch of the Observatory app
• Assisting videographer Remco Merbis at the Vision Conference in Bristol
• Shooting and editing a short advert-style video for Interactive Scientific
• Travelling to Gloucestershire to shoot a video for the Cyber4Schools® pilot
• Helping collate data for the Digital Corsham Safer Internet Day survey
• Editing the Safer Internet Day video under a strict timeframe
• Running the monthly social media report and the weekly communication report
• Learning filming and design at Cirencester College.

I would highly recommend apprenticeships as I feel they allow you to improve professionally while gaining valuable experience in a field that interests you, as well as to meet new people and prepare for a range of work environments, all whilst working towards a qualification. It is an opportunity I am delighted I had, and I can’t thank those at Ci enough.

Kara's story - National Apprenticeship Week 2018

Last week we celebrated National Apprenticeship Week with Digital Leaders. Our current apprentices, Sam and Kara, are completing their qualifications in Creative & Digital Media. You can find out more about their experience by reading Kara's blog below.

My name is Kara and I’m a Junior Content Producer at Corsham Institute. I help plan, film and edit videos, and create animations and graphics to support the work done at Corsham Institute. I’m also learning about creating interactive media such as games and VR content.

I chose the apprenticeship pathway as it offered the work-based learning and office experience I was looking for, as well as providing an opportunity to learn new skills and gain a qualification. I had already completed a university degree in Animation, so I also wanted a job that would allow me to apply the skills I had learnt in a work environment. Ci has allowed me to build a portfolio of client-based animation work as well as build skills in new areas, such as live-action filming and editing.

Ci appealed to me because I was interested in the type of work they’re doing around digital skills and the future of the tech world, and I enjoy the variety of projects I get to work on via my apprenticeship. One day I’ll be filming, the next editing, and then I might be working on an animated graphic, such as the video I created for our House of Lords event. The chance to build all of these skills is really valuable. I also appreciate the support we’re given by Ci: even though we’re still learning our roles, our co-workers respect our skills and listen to the feedback we give, which makes me feel like a real member of the team.

After my apprenticeship I’d like to continue working in content creation, maybe moving more towards games and interactive content. This apprenticeship is helping to prepare me for future careers, both in terms of technical skills and the softer skills side of work. Writing emails, planning projects, communicating with colleagues, managing budgets and carrying out risk assessments are all valuable skills that will serve me for the rest of my career. Technically, I’ve learnt about many areas of content creation and have also had the chance to work with new technology, such as the VR lab at Bristol University, through Ci’s industry links.

Some of the highlights of my work so far have been interviewing local artists for our Peacock Arts Trail video series, creating a video for Jamie’s Farm, and experiencing VR software and training sessions. I’ve also really enjoyed filming and editing the TEDx Corsham event.

I would recommend apprenticeships as a good way to get into the industry if you know what career path you’d like to pursue as you get to learn relevant skills, develop a portfolio and build experience that will benefit you throughout your career. 

Dataclysm, and our 100,000th download

Sometime this week, our podcast Ci - Countering Violent Extremism will celebrate its 100,000th download. That will be a proud moment for us, and we’d first like to say thank you to all the listeners around the world who’ve followed us through our first ten episodes. With this eleventh, we’ve hit a milestone far beyond our original ambition.

Our podcast started on 29 September last year. We aimed to bring the latest information and thinking about online radicalisation not only to specialists in the field, but also to frontline workers and folks on the street.

We’ve interviewed everyone from policy makers, through journalists and writers, to trainers and specialists in targeted data.

The first interview of our first episode, a talk with International Centre for Counter-Terrorism Director Dr Alistair Reed, began with the phrase “Words matter”. We followed up by interviewing writer Cathy Otten.

Words matter, indeed. Dr Reed led us through the subtle and not-so-subtle differences between the various terms used for counter-terrorism work. We explored the British government’s position, which emphasises Countering Extremism - a kind of ideological approach that attempts to push people toward ‘right’ thinking. And we looked at Countering Violent Extremism, an approach that takes the stand that radical thought is a normal part of daily political discourse, and that only ideologies leading to violence merit attention at the government level.

Cathy Otten then took us through a harrowing journey, recounting the experience of Yazidi women trapped by ISIS, forced into sex slavery, murdered for their history and ideology, and eventually forced to flee Mount Sinjar, with many crossing the desert on foot to relative safety in Iraqi Kurdistan.

But we’ve also covered topics as broad as fake news and, in our latest episode, data protection laws and surveillance practices in China, the EU and the United States. What do data protection laws have to do with Countering Violent Extremism? Everything.

Individuals throughout the world are as likely now to encounter news and information online as through a traditional broadcaster or publisher. And often they’re yielding their data as they do, leaving footprints, revealing habits, preferences, religion, ethnicity and political beliefs.

In doing so, we all make ourselves easy targets, not only for advertisers and social platforms, but also for governments keen on social control, as evidenced by the social credit system in China. But we’re also easy targets for security services, and for foreign bad actors like Putin’s Russia, seeking to force citizenries into rival silos of information, undermining democracy and the rule of law.

Cyber4Schools® helps children stay safe online - watch the video case study

“You never know who you can come across online, so I think it’s important to stay safe.”

- Year 7 student

 

“If we can prevent young people from becoming the victims of the future then surely that’s the best thing we can possibly do.”

- Deputy Police and Crime Commissioner for Gloucestershire, Chris Brierley

 

These are some of the comments taken from our short video case study of the Cyber4Schools® pilot currently taking place in Gloucestershire. Supported by the Office of the Police and Crime Commissioner, Gloucestershire Police and Gloucestershire County Council, Cyber4Schools® is a learning programme that helps Year 7 students stay safe online. It’s delivered in school by experts, including the Police, and is one of the approaches to helping children stay safe online that Ci is currently learning from and evaluating.

In the case study, the Head of Year 7 and the Assistant Head of Chosen Hill School in Gloucestershire reflect on the need for online safety lessons, particularly for this age group. Deputy Police and Crime Commissioner for Gloucestershire, Chris Brierley, provides the Police’s perspective, and students remark after the lesson on the impact it has made on their behaviour.

Rachel Neaman, Ci’s CEO, explains the importance of helping all young people to have the skills to use digital tools safely and confidently.

 


 

Ci is keen to work with partners and we’d love to hear from organisations that want to work with us. Please get in touch.

Corsham’s Safer Internet Day - video

“It’s important to be safe online, because lots of people are being bullied...”

 

“If people don’t learn about online safety… anything can happen”

 

These are just two of the comments from young people in Corsham, featured in the video Ci filmed in a number of Corsham area schools as part of our work on Safer Internet Day 2018.

Teachers also reflect on the work being done to bring the schools and community together to empower young people to use the internet safely, and Ci’s CEO, Rachel Neaman, sums up the importance of this work: “We want to create informed citizens of the future, and by involving the whole community in the conversation about this, children, parents, teachers and carers, we can start to have some real impact.”

 


 

Other activities to support Safer Internet Day in Corsham include an exhibition of findings from the survey of more than 2,000 pupils, which runs at the Springhill Campus throughout February half-term before touring the participating schools.