Political Engagement in Digital Democracies

So what is political engagement? Traditionally, it has meant turning up to cast a ballot in elections, perhaps running for the council or joining a party. But in a digital age it means a lot more. We can readily sign petitions, communicate and build movements on social media, speak directly to our MPs online and watch a live stream of what is being debated in Parliament. Theresa May has even turned herself into a YouTuber in the last few days [let’s see if that continues]. There are sites which claim to match your policy preferences to the positions of the political parties to help you decide how to vote. And there are grassroots organisations like More United and Citizens UK which help you build local campaigns around the causes you care about.

This is all very well and good, but it seems that politicians are unwilling to take advantage of these new communicative techniques. These channels can also be gamed by malicious actors. The petition to revoke Article 50 and cancel Brexit received over 6 million signatures, yet was quickly dismissed by the government.

Joining me are James Young and Bess Mayhew. James has been with us a couple of times before and is a BBC3 technology presenter. Bess is the CEO of More United, a movement making politics fit for the 21st century. More United runs grassroots campaigns that connect people directly with MPs from different parties who are prepared to work together on issues they agree on. So far it has grown to 150,000 members and has worked with over 50 MPs.

Useful links mentioned on the podcast

The Russell Brand interview with Jeremy Paxman in 2013

What is Representative Democracy – check out Wikipedia here
What is Direct Democracy – check out Wikipedia here
Here’s a list of different electoral systems – again from Wikipedia!

Please do review us nicely

Go on, give us a 5-star review on iTunes, Acast, PocketCast or whatever you like listening to us on.

Please do get in contact if you like this episode. You can contact James on Twitter here and Bess on Twitter here. I am on Twitter @alicelthwaite and my email is alice [at] twaaats [dot] com.

What is Chinese Social Credit?

Chinese social credit is perhaps the most famous policy to come out of China in recent years. In the West, we tend to criticise and fear it, but we don’t really know what it entails. So, in this episode, we want to actually learn what social credit is all about. Is it really a thinly veiled instrument of an authoritarian state? Or is it a practical means of deciding whether you should lend someone money?

I’m joined by Adam Knight, an independent researcher covering the intersection of public and private actors in the regulation of the Chinese internet space. He is also the co-founder of the China e-commerce advisory company TONG Digital and a freelance producer with Al Jazeera English. He’s got a lot on his plate.

Sam Johnston joins me to probe Adam. He’s an ex-technology consultant who is writing a book about freedom of speech.

Contact information

My email is alice [at] twaaats [dot] com and my Twitter is @alicelthwaite. Sam is @samuelbjohnston and Adam is @adamdknight.

Please do review us nicely

Go on, give us a 5-star review on iTunes, Acast, PocketCast or whatever you like listening to us on.

Moral Panics and the Internet

At the end of February 2019, the UK experienced a large moral panic about a fake story. It was rumoured that there was a ‘challenge’ targeted at children which encouraged them to harm themselves. This challenge was called Momo. It was a hoax. Yet schools sent out advice as to how to protect children from Momo, and even the BBC published warnings. For this episode we’re looking at moral panics and the internet to see if we can learn why this happened.

Warning – this episode does discuss themes of suicide. If you are in need of support please do contact the Samaritans on 116 123.

I’ve got three amazing guests with me for this episode. Gemma Gronland is a researcher at UCL, specialising in how public sector workers enact their duty to report children at risk of radicalisation, which has at its heart questions around child protection and managing children’s online lives. Siddarth Venkataramakrishnan is an FT journalist by day and an internet subculture hunter by night, focusing on the extreme and the unusual. James Young is a documentary maker. His recent documentaries have appeared on BBC3: Can Robots Love Us? and Sex Robots & Us: The Future of Sex.

What is a moral panic?

Moral panics – periods of irrational fear and often hate towards a chosen group or object – are by no means modern, but the internet has transformed them: ironically, it’s both the object of multiple moral panics, and central to spreading them. Panics may be started by earnest but misled citizens, but often become tools for political power. And because moral panics usually deal with (grossly overstated) threats to children, they pose serious issues around safeguarding.

TWAAATS podcasts mentioned in the episode

Identity and the Alt-Right with Andrew Strait and Sid Venkataramakrishnan

Data, Games and Bandersnatch with Jazza John and James Young

Are You in a Relationship with a Machine? with Nika Mahnič and Tulsi Parida

Excellent article by Joan Donovan in BuzzFeed

Opinion: Momo Is The Oldest Kind Of Story: Don’t Leave Your Kids Alone In The Woods

Moral panics and Ebola

I had a chat with Sid after we recorded the podcast to work out whether Ebola really was a moral panic or not. It was on the news a lot – and there were scares about women going abroad and not being able to get pregnant. I feel we went on a slight tangent with this one. However, Sid did say that Ebola wasn’t in the same league as other health scares like AIDS (which definitely contained an element of moralism).

We record these live. I try and speculate as little as possible, but I also try and ask questions that other people may be asking. Sometimes I get it wrong. I just can’t understand why some things spread like wildfire and the truth doesn’t. If we discuss this again I’ll make sure to get a psychologist on.

Have something to say?

As mentioned at the end of the podcast – I sometimes get great feedback on social networks about the podcasts I am on and produce. Please do get in contact if you like what we do – it really means the world! You can contact James on Twitter here, Sid on Twitter here and Gemma on Twitter here. I am on Twitter @alicelthwaite and my email is alice [at] twaaats [dot] com.

Should Internet Access be a Human Right?

Are we at a stage where access to the internet should be a human right? Join Alice Thwaite, Vidushi Marda (ARTICLE 19) and Areeq Chowdhury (WebRoots Democracy) to talk about human rights! It’s true that the internet is a facilitator of many human rights. A few picked out from the UN Universal Declaration of Human Rights are: freedom of expression, freedom of assembly, the right to work, the right to education and the right to freely participate in the cultural life of the community. We rely on internet access for many of these things in the modern world.

Some states actually switch off the internet, or social media platforms, to stop protests and riots. Bloomberg reported that in January alone there were internet shutdowns in four African states: Sudan, Gabon, Zimbabwe and DR Congo. In Sudan, for example, social media was shut down as protestors called for the president, Omar al-Bashir, to step down. In DR Congo, the whole country had no internet for 20 days following a contested presidential election.

However, worldwide universal internet access is still a pipe dream. In June 2018, 55.1% of the world’s population was online. The figure is much higher in developed countries, but even there a significant portion of the population rarely uses the internet. In the UK, 89% of the population uses the internet weekly, which means 11% do not. So let’s have a conversation about access to the internet and human rights.

Guests

Areeq Chowdhury is the Founder and Chief Executive of the think tank WebRoots Democracy which explores the intersection of technology and democratic participation. In particular, it focuses on online voting in elections as well as the regulation of social media platforms.

Vidushi Marda is a legal researcher interested in the interplay between emerging technologies, policy, and society. Currently a Digital Programme Officer with ARTICLE 19's Team Digital, her primary focus is on the ethical, legal, and regulatory issues that arise from algorithmic decision making. She also works on strengthening human rights considerations in internet infrastructure, particularly at internet governance bodies like ICANN and the IEEE.

Interesting links

Vidushi mentioned a couple of UN resolutions which mean that your online rights are equal to your offline rights. Read the Article 19 statement on this here.

If you’d like to know more about ISPs (Internet Service Providers) check out the Wikipedia page here.

Email meeee

My email is alice [at] twaaats [dot] com and my Twitter is @alicelthwaite. Areeq is @AreeqChowdhury and Vidushi is @VidushiMarda.

Go on give us a review

If you like the podcast then please review us on iTunes, Acast, PocketCast or whatever you like listening to us on.

The Ethics of WhatsApp

WhatsApp is a platform which is widely used and widely exploited. This is a podcast of two halves. The first will talk about disinformation on WhatsApp. You may have heard about fake news in India, but disinformation is spread in other countries too. For this discussion, we focus on Brazil.

In the second half we’ll talk about how bankers and the finance industry are misusing WhatsApp. Although most banks have banned the app, bankers continue to use it because it is so simple and effective. However, this means they may be giving client data away and, perhaps more importantly (at least from society’s point of view), their conversations cannot be audited.

As always, I’ve got two wonderful guests with me. The first is Charlotte Wood, who specialises in innovation, finance and FinTech; for her day job she is Head of Innovation at a financial organisation. She graduated from the University of Cambridge with a first-class degree in Neuroscience, followed by a year of Management Studies. I’m also with Caio Machado. He is a lawyer, a Fellow at the Institute for Technology and Society (ITS-Rio) and a researcher at the Computational Propaganda Project (ComProp). He holds a Master’s degree in Internet and Data Law from the Université de Paris 1: Panthéon-Sorbonne and a Law degree from the Universidade de São Paulo (USP), and is currently pursuing an MSc in Social Sciences of the Internet at the University of Oxford.

Metadata

We discussed metadata in the podcast. This is a set of data that describes and gives information about other data. As you may have guessed, that definition was lifted straight from my friend, Google. Essentially, the metadata of a message tells you everything about the message without actually saying what the message says. So if I message my friend Bex: “Hey, we on for dinner tonight?”, Facebook will know that my user ID messaged Bex’s user ID, at a particular time, from a particular location. They just don’t know what I actually said. Metadata can give away a lot about your comings and goings. You can perhaps learn more from metadata than from the data itself.
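
To make this concrete, here is a minimal Python sketch of how a message’s content and its metadata might be separated. The field names and values are invented for illustration; they are not WhatsApp’s or Facebook’s actual schema.

# The message itself: on WhatsApp this body is end-to-end encrypted,
# so the platform cannot read it.
message_content = "Hey, we on for dinner tonight?"

# The metadata travels alongside it and is visible to the platform.
message_metadata = {
    "sender_id": "user_1234",           # my user ID
    "recipient_id": "user_5678",        # Bex's user ID
    "sent_at": "2019-03-01T18:42:07Z",  # when the message was sent
    "location": (51.5074, -0.1278),     # approximate coordinates of the sender
    "device": "iPhone",                 # what it was sent from
}

# Without ever reading the content, the metadata alone reveals who spoke to
# whom, when, from where and how often.
print(message_metadata["sender_id"], "->", message_metadata["recipient_id"])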

Disinformation vs misinformation

The latest (Feb 2019) DCMS committee report said ‘disinformation’ describes “the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain”.

Misinformation is when you forward an article which is fake but which you believe to be true. Your intention, therefore, is not to deceive or mislead your audience.

Get in touch

My email is alice [at] twaaats [dot] com and my Twitter is @alicelthwaite.

Please leave us a nice review 🙂

If you like the podcast then please let other people know.

The Rights of the Dead

The dead may well outnumber the living on Facebook within the next five decades. These people could be tagged in a photo, sent a friend request or wished a ‘happy birthday’. What kind of rules should be set here? Should your account be deactivated? Can someone else claim your username after you die?

So there are platform-specific issues around death. But there is also a whole host of services cropping up which Carl calls ‘the digital afterlife industry’. These include information management services, which help families work out how digital assets (photographs, messages and so on, I assume) are handled after death. Then we have posthumous messaging services: you arrange for specific people to receive messages after your death; the company might periodically send you an email, and if you don’t reply, it assumes you’ve died and passes the messages on. Online memorial services commemorate the deceased’s life in many interesting ways. ‘Re-creation services’ look to recreate the person who has died using their digital footprint.
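
As a concrete illustration of that “check in or we assume you’ve died” mechanism behind posthumous messaging services, here is a tiny Python sketch of a dead man’s switch. The grace period and function names are my own invention, not any real service’s logic.

from datetime import datetime, timedelta

# Assumed grace period: if the user ignores check-in emails for this long,
# the service presumes they have died.
CHECK_IN_GRACE = timedelta(days=30)

def should_release_messages(last_check_in: datetime, now: datetime) -> bool:
    """Return True once the user has been silent for longer than the grace
    period, at which point the pre-written messages are sent to their
    chosen recipients."""
    return now - last_check_in > CHECK_IN_GRACE

# Example: the last reply to a check-in email was 45 days ago,
# so the messages would be released.
print(should_release_messages(datetime(2019, 1, 1), datetime(2019, 2, 15)))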

Carl Öhman is completing his doctorate at the Oxford Internet Institute. His research looks at the ethical challenges regarding commercial management of ‘digital human remains’. This is the data left by deceased users on the Internet. He is an expert when it comes to digital death.

Sam Johnston is an ex-technology consultant who is writing a book about freedom of speech. I’m hoping to recruit him to become a regular twaaat.

Interesting articles mentioned in the podcast

Carl’s research can be found here in Nature, and the study he mentioned on the growing number of dead people online can be found here. If you’re after something a bit more academic then take a look at his paper here.

If you liked the idea of bloody others and non-bloody others, please check out: Are You in a Relationship with a Machine?

Please talk to me and leave a nice review

Email me or contact me – alice [at] twaaats [dot] com or @alicelthwaite on Twitter. Sam is @samuelbjohnston and Carl can be stalked on the internet (his words, my paraphrase).

I really appreciate the emails I get from you guys. And also that the listener body seems to be growing, and I have no idea why. So thanks for spreading the word!

Facial Recognition Systems

Today we’re talking about facial recognition systems. These are technologies which can recognise and analyse your face. Used both for identification (e.g. tagging you in photos on Facebook, or on security doors with restricted access) and for reading emotions (e.g. this film makes you happy, this product makes you sad), they are becoming more and more common, perhaps without you even noticing.
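
For a flavour of the identification side, here is a minimal Python sketch using OpenCV’s bundled face detector. Detection (finding a face in an image) is only the first step; the systems discussed in this episode go on to match the detected face against known identities or read emotions from it. The filenames are placeholders, not from any real deployment.

import cv2  # opencv-python

# Load the pre-trained frontal-face Haar cascade that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("photo.jpg")                 # placeholder input image
grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # the detector works on greyscale

# Each detection is a bounding box (x, y, width, height).
faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    # A recognition system would crop this region and compare it against a
    # database of known faces; here we just draw the box.
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces_detected.jpg", image)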

I’m joined by Dr. Lisa Murphy, a clinical fellow at Public Health England, who wants to use technology to improve population health, clinical care and NHS operations. She is also a fellow at Newspeak House, a community in East London that we all make fairly regular use of. I’m also with Rula Awad, the founder of AIPoli. She is an advocate for information and algorithmic transparency, particularly when it comes to political and social content. Rula has built facial recognition systems in the past, and so can give us insider knowledge on how bias is perpetuated throughout these systems.

Facial Recognition and Medicine

Diagnosing genetic conditions – read more here.

Read this medical journal article about using facial recognition systems to predict cardiovascular disease.

Shopping and Marketing

We mentioned an article on how shops are using facial recognition to improve their marketing, product placement and advertising. Read more here.

Algorithmic Bias

Rula mentioned the Gender Shades study. Read more about it here.

We still find bias in these algorithms – in January 2019 it was found that facial recognition systems still struggle to identify darker-skinned and female faces. Read more here.

Wanna contact us?

Email me or contact me – alice [at] twaaats [dot] com or @alicelthwaite on Twitter. Lisa is @LisaMurphy107 and Rula can be found here @rula_awad.

Thanks for all your continued support! Go ooonnn give us a nice review on your favourite podcasting network OR just tell all your friends whenever you see them.

Are You In A Relationship With A Machine?

We interact a lot with machines. Whether we’re using our mobile phones, giving Alexa commands or even playing with a (sex) robot, we’re now giving these devices more intimate information than ever before. In a way, this means we might feel like we are in a private relationship with them, but the reality is that a lot of corporations hold this data.

But we’ve always had a relationship with machines. We’ve found dildos that are 28,000 years old, we’ve fantasised about cars and statues, and sometimes our social status depends on the machines we own and interact with. So what really has changed?

For this episode I’m joined by Tulsi Parida and Nika Mahnič. Tulsi cares about reducing digital inequality and promoting responsible/inclusive technology. She has worked in startups in India and the States that look at the intersection of literacy and tech. Tulsi recently completed an MSc at the Oxford Internet Institute, where she studied mobile learning in emerging markets through a gender and political economy lens, and is currently pursuing an MBA at Saïd Business School. Nika is an interdisciplinary researcher, writer and dancer. She wrote her BA thesis on sex robots at the Faculty of Social Sciences, University of Ljubljana, and obtained her MA in Big Data at King’s College London. Her MA thesis focused on the first proactive companion robot and manager of IoT devices to employ emotional artificial intelligence. She is a co-founder of the Danes je nov dan / Today is a new day institute and an activist at the Campaign Against Sex Robots.

Useful links for the podcast

Here are some interesting articles and videos from Nika. Watch her talk about sex robots here, and read her articles on ‘bloody others’ here.

We spoke about Alexa and misogyny in the episode. There are a couple of articles on this listed below, but I couldn’t find anything about whether using Alexa itself increases misogyny towards real women. Consequently, I’m loath to say this is true. If you have any more information please shoot it across!

Here’s a little pic of us recording the podcast. My flat is quite light and so we look dark in the foreground. But it’s still nice to ‘know’ that we might be real people and not machines 🙂

Alice Thwaite, Nika Mahnič and Tulsi Parida

Spread the word!

This is a new series so please do help us spread the word that we’re back! Feel free to give us a 5* review on your platform of choice. Email me or contact me – alice [at] twaaats [dot] com or @alicelthwaite on Twitter.

Til next time!

Data, Games and Bandersnatch S2E1

WE’RE BACK! HAPPY NEW YEAR! This week I’m joined by two very special guests – Jazza John and James Young. We’re talking about gaming and data collection, with a focus on Netflix’s latest Black Mirror episode: Bandersnatch.

Last year we woke up to the consequences of Facebook collecting huge amounts of data about us. So for the first episode of the new series I wanted to take a look at the data collection that happens within games and games companies. Whereas 20 years ago all your gaming data was stored on a local drive attached to your PlayStation or Nintendo, today that data is fed back to the gaming industry. Netflix launched a new game over Christmas as part of the new Black Mirror series, called “Bandersnatch”. Should we be concerned that this behavioural data will be fed into Netflix’s recommendation algorithm? Or is the bigger concern the creativity of game designers? Will data collection mean there are fewer innovative ideas and games, as every developer seeks to make the next blockbuster?

James Young is a documentary maker. His recent documentaries have appeared on BBC3: Can Robots Love Us? and Sex Robots & Us: The Future of Sex. Jazza John is a YouTuber, podcaster and professional internet show-off. We all know a fair bit about technology and love a good game.

Interesting links mentioned in “Data, Games and Bandersnatch”

My article in Quartz, “Black Mirror isn’t just predicting the future – it’s causing it”, pretty much summarises my point of view on Netflix and Bandersnatch 🙂

We talked a bit about #Gamergate and narrative-driven gaming. My favourite article on Gamergate is in Vox and written by Todd VanDerWerff – check it out here.

One of my favourite mobile and console games is called Kingdom. Check it out.

Last year we recorded an episode with Jazza on Cambridge Analytica and the scandal that erupted. Check out that ep here.

There is NO LINK between playing violent video games and increased real-life violence

This is a really important point, and I feel we didn’t hammer this home enough in the episode. For more information check out the Oxford Internet Institute’s research here: “Poor behaviour ‘linked to time spent gaming not types of games'”

Tell all your friends and family and enemies about us

This is a new series so please do help us spread the word that we’re back! Feel free to give us a 5* review on your platform of choice, drop me a line telling me you love me (alice [at] twaaats [dot] com or @alicelthwaite on Twitter) or share this ep online.

Til next time!

SPECIAL: Alice Thwaite on Echo Chambers

In these last two episodes of TWAAATS we interview each other about our specialist subject areas. In this episode, Andrew Strait chats to Alice Thwaite about her work on echo chambers. This is an area Alice has spent a number of years researching through her work on The Echo Chamber Club. She came to the Oxford Internet Institute with the goal of updating and adapting the theory associated with echo chambers, as previous incarnations seemed to have a number of faults. For example, no one seemed to have differentiated between groups that are merely different, groups that disagree and groups that are polarised, and no one had pinned down the link between access to information and polarisation. Listen now to the conclusions she reached.

Get in touch

Email Alice here, and Andrew here. We can also be found on Twitter: @alicelthwaite and @agstrait.