Oblivious design

A transcript of Episode 166 of UX Podcast. James Royal-Lawson and Per Axbom talk about how our focus is normally on very specific scenarios and how this causes potential uses (and abuses) to go overlooked.

Featuring an interview with Adam D. Scott about ethical development.

Transcript

[Music]

James Royal-Lawson: A few weeks ago, you woke up far too early on a Sunday morning and, while trying to get back to sleep, you had the idea for the Dick Pic Locator — an impression you’d got from a great episode of Note to Self, where Manoush Zomorodi explores all the data that you can find out about a person just from a simple selfie.

Per Axbom: Uh-hmm.

James: So tell us a little bit more about the service and why you did a Dick Pic Locator.

Per: I think maybe it was just that. The name came first, Dick Pic Locator, because I started realizing what a huge problem this seems to be on the internet: that dudes are sending pictures of their penis to girls — unsolicited pics, anonymous pics — in chats, in forums, over email, even sending via MMS. It’s crazy. It’s a phenomenon, and some women get this every day.

James: Uh-hmm. It’s terrible.

Per: So then I remembered this episode and I realized, well, we could probably get a lot of data from those photos that are being sent to these women. And I realized, well, that’s open data. It’s metadata. It’s part of the photo. So it must be really easy to get that data online, and I searched online and there are tons of scripts for it. So I realized I could build this really quickly, and I built the site — including the domain name and the text and the copywriting — in six hours and realized, okay, that was easy. Now let’s see what happens with it.
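
The kind of script Per describes really is only a few lines of code. As a rough illustration, here’s a minimal sketch in Python using the Pillow library — an assumption on our part; the episode doesn’t say which scripts the actual site uses, and the file name is a placeholder:

```python
# Minimal sketch: read GPS coordinates from a JPEG's Exif block using
# Pillow (pip install Pillow). Tag 0x8825 is the Exif pointer to the
# GPS sub-directory, per the Exif specification.
from PIL import Image

GPS_IFD = 0x8825

def to_degrees(dms, ref):
    """Convert Exif (degrees, minutes, seconds) rationals to a signed float."""
    value = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    return -value if ref in ("S", "W") else value

def gps_coordinates(path):
    """Return (latitude, longitude) if the photo carries GPS Exif data, else None."""
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(GPS_IFD)
    # GPS IFD tags: 1 = latitude ref, 2 = latitude, 3 = longitude ref, 4 = longitude
    if not all(tag in gps for tag in (1, 2, 3, 4)):
        return None
    return to_degrees(gps[2], gps[1]), to_degrees(gps[4], gps[3])

if __name__ == "__main__":
    # "photo.jpg" is a placeholder file name
    print(gps_coordinates("photo.jpg"))  # e.g. (59.3293, 18.0686) for Stockholm
```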

James: So with the service itself, you just basically upload a picture — supposedly of a penis that you’ve received — and then it will tell you whereabouts in the world, exactly, that picture was taken.

Per: Yeah.

James: That’s the premise for it. But in reality, a lot of the pictures in question won’t have their GPS data attached any more.

Per: Exactly.

James: So the true intention…

Per: I’m not sure I had an intent… When I built the service, I don’t think I was really aware of how few photos still have that data when they’re sent over chat programs, because the data is there in the photo, but…

James: Originally

Per: …a lot of these services actually strip away the data, and that may be a problem in itself. Why is it manipulating my photo that I’m sending to someone else? So I’m dealing with two things. Partly, of course, I want to attack the problem of Dick Pics, because I think that’s a huge problem.

James: Yeah. The bullying, the intrusion, the…

Per: But there’s also the problem of all this data in photos that, obviously, not a lot of people are aware is there. So you don’t know what kinds of risks you’re taking online when you’re even just sending a photo to another person, or posting it online, or posting it on a buy-and-sell site, whatever.

James: Oh, how much information you’re revealing depending on how you sent the picture. Because of course, if you’ve got your iPhone or whatever it is and you’ve taken a photo, by default it’s going to have all this data stored in it.

Per: It’s not just the GPS data, it’s the — it could be the color of your phone, what type of phone it is, so much — so many more things.

James: And we — I mean, we weren’t completely aware that this data was stripped by certain services. I’d probably guessed it would have been, actually, by some of them, because they reformat the picture — they basically compress it again or turn it into a thumbnail — and you generally don’t bother to transfer the Exif.

Per: But that’s a privacy issue as well. If I’m sending something, why would you manipulate my photo before it reaches the receiver? So I know that Signal, for example — the privacy chat app — does not strip this data. And the Discord app that my kids actually use to talk when they’re gaming, that does not strip the data from the photo either.

James: Sometimes it might just be an accidental consequence of something else, like generating a thumbnail, but I think we’ll get into that a little bit later. One thing here which really interested me — me and you were discussing it a little — is that you had a bit of a media storm around this.

Per: Yeah. That was insane. I was on the radio. I was on TV. I even reached Australia and Germany. They were talking about it in the New York Post, The Sun in Britain, Russian newspapers posted about it — it’s just been an insane couple of weeks.

James: Yeah.

James: It became a summer story on a lot of big newspaper sites these last few weeks, which was very interesting in itself. But when me and you were chatting about it, we realized that an aspect of this is not just the — well, there’s the history of Exif. Exif is the data format — the metadata format — that we attach to pictures. We bake it into JPEGs to store all of these bits of information.

Per: Uh-hmm.

James: When that was added to Exif — it was added in version two.

Per: Which was a long time ago.

James: It was a long time ago. It was rubber-stamped in November 1997. Now, I’d hazard a guess that there are going to be some people listening who possibly weren’t even born then. What definitely wasn’t born was the iPhone. What definitely wasn’t born was the camera phone. I mean, I got my first camera phone, I think, in 2002, and it was really odd. People were really, really curious about what was going on with…

Per: And the quality of the pictures was so bad they didn’t really use them.

James: First digital cameras — I mean, me and you got them around ’98, ’99. So around this time we didn’t have a world where huge swathes of people were carrying around cameras in their pockets with built-in GPS locators — a personal computer in their pocket, in the palm of their hand, to take photos and videos and transmit them instantly across the world. It didn’t exist back then. It was still science-fiction stuff. But they included this little line in the metadata description that allows you to store that location data.

Per: And to them it probably made a lot of sense, because it can be used for so many good things. It’s just data about a photo, so you can use it for searching for photos and placing photos on a map.

James: Yeah. And back in ’97, if you did have a digital camera, you would have basically had to transfer your pictures — although you’re more likely to have had a film camera and digitized the pictures from the film. You would have imported them into some kind of tool, and then you were adding the metadata to the pictures post-scanning, post-import.

Per: Very consciously.

James: Yeah. Back then I used to get a CD from the film shop where you got the films developed — a photo shop — and then I’d take that CD and load the pictures into another tool. But you would have to purposefully add the data, so you would have to write “Stockholm” and put in the data for Stockholm. That is very different from how it is now.

Per: Yeah.

James: So, well, what this made me think about was — okay, we’ve got the nasty side of the web: the internet bullying and the online hate side of Dick Pics, which is terrible in itself. But here we have a situation where some people were probably really enthusiastic and cared a lot about an image format — Exif, attached to JPEG — and they had the really good idea of the location field and so on. How could they possibly know the future?

Per: They couldn’t.

James: Not fully.

Per: Only, of course, if they were time travellers.

James: They can guess aspects of the future. But they didn’t know. So then we have an example of how you create something with good intentions, and time goes on, and suddenly something which had good intentions can be remixed, remashed, and used in ways which we really don’t like. I think you’ve got the example of the cat pics website.

Per: Yes. iknowwhereyourcatlives.com, and that’s a great example, and I used it in my post when I was writing about the Dick Pic Locator, because I could potentially have done that with Dick Pics. So what this guy has done is actually…

[crosstalk]

Per: He’s actually just downloaded lots of pictures of cats from the internet with GPS data, but also from Instagram, where some people actually tell you where they’re posting from.

James: And Twitter as well of course.

Per: Yes. Exactly. And he places all the cats on a map, so you can see which cat belongs to which house.

James: And you can zoom in to street level on Google Maps and see, of course — because of the GPS data — exactly which building that cat lives in.

Per: And now, as you’re listening, imagine doing that with penises.

James: Uh-hmm.

Per: I could possibly have done that with my service.

James: Yeah. And, well, you can do it with any photo theme and so on. That’s the kind of thing where, when we sit there designing, we’re oblivious to some of the things and ways in which our creations can be used in the future. Now, the chat we were having was about how often this happens. On many occasions, the things that get publicized are when the oblivious designs we create are actually security holes. We’ve created something in a way where we made some mistakes — we left some gaps — so hackers can come in and take advantage of that. I don’t think we leave these holes in deliberately. It’s more that we just haven’t been aware that they exist.

Per: Exactly. Yeah.

James: And so security is often the main area of “oblivious design”, but it doesn’t necessarily have to be limited to security. I mean, it can be that you weren’t fully aware of a cognitive bias. That also would mean that your design maybe gets used or consumed in a way which you didn’t really expect.

Per: I mean, anything on the internet nowadays — anything you do — can scale so fast, to so many people. You might think that within this context, within this group of people, it’s going to be fine, but as soon as somebody copies it and spreads it to a billion people, you never know what’s going to happen.

James: There can be some really cool and good stuff. We’ve had some great mash-ups over the years of various services. So that’s not to say that everything we create — the oblivious aspects — has to be bad.

Per: Well, we’re creating tools. All tools can be used for good or evil, essentially. And you can’t predict how other people are going to be using your tool.

James: I think that’s the big key thing here. The number of specific scenarios and situations that we design for is always going to be very finite. We know that we’re designing for three scenarios, five tasks, one goal — these are very concrete numbers. Whereas when we start to look at the permutations and combinations of technology, of design, of behavioural psychology, of the future, the number of combinations we’re oblivious to rapidly heads towards infinity.

Per: And it’s becoming so complex. What really bothers me is that it was a conscious decision to actually add GPS data to your photo back in the day, but now it’s unconscious, because people are not even aware that it’s happening — and so many things people are not aware of are happening. Nobody is taking responsibility for informing people: “Think about this before posting photos online”, “Think about this before emailing”, “Think about this before giving away your IP address” — or that box that pops up on different websites saying “I want to know your location”, where you just click yes or no. A lot of people click yes without thinking.

James: And some of these scenarios we’re aware of, but they aren’t the ones that we’re specifically designing for. We say they’re edge cases — and we talked in previous episodes, with Eric Meyer, about how edge cases themselves are not a good thing, because the term ignores something real that’s happening to real people. Whereas here it’s one step further. It’s not just an edge case. This is an oblivious case.

Per: Exactly. And your intent is good. An example would be just adding a Facebook like button to your website. Yes, I added a Facebook like button — what does that mean? It means that Facebook can track me across the web, all the way to that website, because you’re basically adding a tracker to your website, which affects all the users of your website.

James: Giving the data to a third party that you aren’t party to.

Per: Uh-hmm.

James: So this leads us to: what can we do to counter “oblivious design”? To help us with that, we’re going to have a little chat with Adam Scott.

Per: So, Adam Scott is a web developer, educator, and author who has written a series of e-books on the topic of ethical web development, the most recent of which is Collaborative Web Development.

[Music]

James: You’ve written for a while now about ethical web development. Just tell us a little bit about that — how you got into it and what you’ve been doing.

Adam Scott: Sure. Here in the US, I actually worked for a federal agency a few years ago — a new federal agency started up out of the financial crisis in the USA. I took a position with that agency and I realized that what we were doing was really building things in the public trust, and it got me really interested in how the decisions we were making all the time could be impactful for a wide breadth of citizens. I wasn’t targeting a specific demographic; suddenly it was the entire country, with its whole range of needs for our services. And as I was doing that, it really made me think about how the decisions we make are impactful on those users. By choosing a certain technology we could be excluding someone; by installing tracking software we could be implying that the government is tracking someone in certain ways.

So it really just led me to think about all these things and how they’re often ethical decisions, and how it’s easy to make decisions in the moment that maybe aren’t in the user’s best interest because they’re simple from a technology perspective, or the path of least resistance. And so thinking about ethics as part of the process of web development became something that I was really fascinated with. A lot of other industries have these codes of ethics that we kind of lack in the digital world, even though we’re making important decisions for people.

James: Yeah. And this brings us quite neatly and nicely to the question I want to pose to you. How do we counter this kind of issue — the issue of oblivious design — when there’s an almost infinite number of scenarios that exist for the things that we create? Whether it’s through ethical decisions, or decisions we just haven’t been aware of, or things that aren’t invented yet — what can we do?

Adam: Yeah. I mean, it’s certainly a challenge. I like to think of it as a process. Oftentimes we’re thinking about these things at a certain point as we’re building a tool, rather than taking a step back, thinking holistically about the whole picture, thinking about our user base, and building from there.

So from my perspective, it really aligns with user experience — from a development perspective — when I think in terms of ethical design. It’s really taking a user-centric approach to development and thinking about our users as people, not just as numbers in the analytics ticker.

James: Yeah. Creating that empathy for the fact that it’s a real person at the other end.

Adam: Yes. Yeah. I think empathy is a hugely impactful emotion, and as we build, it really helps us build things that have our users’ best interests in mind.

Per: I think you have to have a pretty good and vivid imagination though to imagine all the different scenarios that could happen based on the design decisions you make and what could happen 10 years from now.

Adam: Yeah. That’s for sure. I mean, we don’t know where technology will be a year from now, let alone 10 years from now. So taking an evolving approach too is pretty important, I think.

Per: Because, as you say, it’s about thinking about the outcomes — what could potentially happen — and doing an analysis. But then, at some point in the future, something will happen and you realize: I didn’t intend that to happen. How can you backtrack on design decisions that you’ve made before?

Adam: Yeah. I mean, that’s a challenging thing to do. I think there’s no way to guarantee that how we intend our technology to be used is how it’s always going to be used. Certainly we see that with a lot of social platforms. They maybe have really good intentions from the outset but, well, they can be used to fuel really terrible actions in people. So I really think that’s just a difficult thing that our industry is going to face. But I do think that if we take a step back, if we think about what our intentions are and how we’re building things for users, it really can help us do a better job of that. If we’re not thinking of people as dollar signs, and we’re thinking about solving problems, I think that can be impactful too.

Per: Yeah. So I have to ask you, what was your reaction when we sent you the link to the Dick Pic Locator?

Adam: I mean, my reaction to the title — it certainly grabbed my attention. As I dug in, I can’t comment on whether it’s solving the problem as intended, but I think it brought some really interesting ideas to the table.

So this idea of awareness certainly is pretty great. Also, just making some really conscious design decisions that were user-centric — looking at the site and going through the article, thinking about: do you really trust me, if you’re about to upload this photo? Intentionally putting that into the design, so that it causes people to pause and think about what they’re doing with their data before they upload it to the site.

James: Exactly. Yeah. I mean, in Per’s case, with the service, it’s a box that you click on and then you upload a personal photograph — a very, very personal photograph in this example — and Per subtitles that with “do you really trust me with your dick pics?”. Yeah. It’s an excellent way of throwing that out there.

Per: So that’s an interesting thing, from a process perspective, like you’re saying — because that’s something I tend to say a lot: that we need to add more friction. In usability, people always say we need to remove friction, to make things easier, but when we make stuff easier, people make bad decisions as well. So is friction part of the process that you see when working with this?

Adam: Yeah. I think it’s introducing mindfulness to the process a little bit — just raising awareness. I think a great example is Medium. They do this thing where, if you have Do Not Track enabled, they will not embed certain videos or do certain things, and they’ll show that on the site, so it’ll say: you have Do Not Track enabled; we can’t show this because it involves Google’s tracking script. So I think that’s a great thing, but it’s sort of presenting it after the fact. It would be nice if there were some way that, as a user, I was aware of that tracking without having consciously asked for it to be turned off. So I think there’s certainly a little more friction that could be introduced to these types of things, which could empower users to make decisions that are in their best interest.
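
The Medium behaviour Adam describes comes down to checking the visitor’s Do Not Track signal before serving an embed that carries a tracking script — browsers send the preference as a `DNT: 1` request header. A hypothetical server-side sketch in Python with Flask (Medium’s actual implementation isn’t public, and the embed URL is invented):

```python
# Hypothetical sketch: only serve a third-party embed that carries a
# tracking script when the visitor has NOT enabled Do Not Track.
# Browsers send the preference as the "DNT" request header ("1" = on).
from flask import Flask, request

app = Flask(__name__)

TRACKED_EMBED = '<iframe src="https://example.com/video?analytics=on"></iframe>'
NOTICE = ("<p>You have Do Not Track enabled, so this embed "
          "(which includes a tracking script) is hidden.</p>")

@app.route("/article")
def article():
    if request.headers.get("DNT") == "1":
        return NOTICE       # surface the decision to the user
    return TRACKED_EMBED    # business as usual

if __name__ == "__main__":
    app.run()
```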

James: Uh-hmm. Yeah. Going back to the pictures themselves — we mentioned how so many services strip the metadata from the pictures, and we wondered: how many times is that a conscious decision, and how many times is it that some guy or girl in a sprint realized they needed to make thumbnails of pictures, so they grabbed a tool to generate thumbnails, and the metadata got stripped?

Adam: Right. I mean, that’s again that path of least resistance, right? Choosing the easiest tool — and that’s a wonderful thing. It’s one of the empowering things about open-source software, about building on each other’s work. It’s really a great thing, but it also means that sometimes we’re unknowingly making decisions about what happens with our tools. I think that’s a great example. I could just be pulling in a library for an image, and it automatically does this stripping of metadata without me being aware of it — it’s just solving the problem. It’s the easiest thing I can use to solve that problem.
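
As a concrete illustration of how this happens by accident — using Pillow as an assumed example library (many image tools behave the same way), with placeholder file names — the obvious thumbnail code re-encodes the image and silently writes no Exif at all; keeping the metadata has to be an explicit choice:

```python
# Sketch: the "obvious" thumbnail code drops Exif metadata as a side effect.
from PIL import Image

img = Image.open("upload.jpg")   # placeholder file name
img.thumbnail((200, 200))        # resize in place, keeping aspect ratio

# Re-encodes the image and writes no Exif at all -- the GPS data is gone,
# whether or not anyone consciously decided that:
img.save("thumb.jpg")

# Carrying the metadata across has to be an explicit choice:
img.save("thumb_with_exif.jpg", exif=img.getexif())
```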

James: Within security, I guess, we’ve seen the advantage of openness for finding security flaws — the way that we find something and then share it to say, “look, we found a flaw”, and the openness and shared knowledge gets things fixed quicker. Is that maybe part of the key to how we can reduce this — thinking about all these other aspects as well? If we have more openness around what we’re doing, can that help?

Adam: I certainly think it could raise awareness. Right now, we’re sort of limited in the tools for how we as users are empowered to request these types of changes. If I see something like that and I’d like to have it changed, there’s not a clear path for me to request that from the company, or to raise awareness in the broader context. We’ve seen things like open letters maybe take off, but that’s a pretty rare occurrence. So certainly I think there are probably some interesting ideas from the world of security that could be brought into this.

Per: But in your experience, how experienced are companies at thinking about these issues? A simple example: HTTPS. Not all companies have that enabled. Is that a conscious decision, do you think?

Adam: It’s really interesting, because a lot of times, as a developer, as a designer, you’re not always making product decisions. It might be a product owner. You have a sprint goal; that doesn’t fit in, so maybe it’s sitting on a backlog somewhere: oh, HTTPS would be really nice to have, but we have all this other work to do, we have to throw it over the wall to our systems administrators, that’s going to be a lot of back and forth. So it’s easy to take on some other feature task and sit on those things, because they’re not necessarily driving business indicators, right? HTTPS maybe would — but not necessarily, depending on the site. So it’s easy to kick that sort of thing down the road a little bit as a nice-to-have, especially when you’re working with an existing application that you’re building on and making improvements to, and you’ve got months or years of technical decisions that have already been made that you’re sort of forced to live with and adapt to.

Per: Because — I mean, if you go out and do research like you’re supposed to, and you go out and talk to users, they’re not by themselves going to say, “yeah, I wouldn’t want to visit a website if there wasn’t HTTPS”. But of course that’s because they’re unaware of what could happen if they submit data to that website without HTTPS. So it’s so hard to build that into the backlog — no user has expressed that need. So who is responsible for expressing the need for them?

Adam: Right. It’s an interesting dynamic, because as technologists, we’re the ones that are aware of these things, more often than not — especially if we’re engaged in this sort of world of building things that are more ethically centred or more mindful — but we’re not always the ones making the product decisions. So it becomes a real challenge.

James: And so I’ve been thinking about — well, I’ve been reading up on some aspects of this. One of the examples about privacy and personal data is that some services include your username in the URL for your profile, and that can be used as a proxy for tracking, because anyone with Google Analytics — anyone with the data — will see your username and be able to link it back, right? So that’s something that you need to be aware of when you’re building these products. Is that something I should keep in mind as a designer, or something I have to rely on developers for? It’s a real kind of grey zone — is it technology, is it privacy, is it ethics, is it design?

Adam: Tracking is a really interesting example. I mean, it’s so easy to track users, not just on your own site but around the web. You can embed a cookie that will interact with an ad you have on another site, or embed a social sharing widget, and now I know not just what you’ve done on my site but also what other sites you’ve visited. That’s how we get those scary ads, where you shop for new bed linens and then every site you go to has an ad for bed linens, and you start to wonder what’s happening — and it’s really just you being tracked around the web. So it’s not even just tracking on your own application. You can be making decisions that influence how users are being tracked outside of your own site.

James: Yeah. I know that the companies that track data are now up in the high 90s, percentage-wise, for accuracy of cross-device tracking.

Adam: Oh, yeah.

James: And you don’t really need cookies for that now to get to that kind of level, because there’s just so much data out there.

Adam: Right. Yeah. It’s crazy. The EFF has a great tool — I think it’s called Panopticlick — that can show how easily your browser can be fingerprinted based on a whole number of different things. So really, if you’re running a browser with JavaScript enabled — which we all are — it’s easy to fingerprint you all across the web, just based on the add-ons you have and all sorts of different things that are unique to your browser.
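
The principle behind a tool like Panopticlick is that each browser attribute leaks a few bits of identifying information, and the combination is close to unique. A toy sketch of the idea in Python — the attribute values below are invented examples, and a real fingerprinter collects dozens of signals via JavaScript:

```python
# Toy illustration of browser fingerprinting: hash a handful of attributes
# a site can read without cookies. The values below are invented examples.
import hashlib

attributes = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) ...",
    "screen": "2560x1440x24",
    "timezone": "Europe/Stockholm",
    "language": "en-GB",
    "fonts": "Arial,Helvetica,Verdana,...",
}

fingerprint = hashlib.sha256(
    "|".join(f"{k}={v}" for k, v in sorted(attributes.items())).encode()
).hexdigest()

print(fingerprint[:16])  # a stable identifier -- no cookie required
```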

James: Yeah. Incognito mode doesn’t actually save you.

Adam: Right. No — sometimes that can be worse, because it identifies you even more.

Per: That’s the incognito guy with a battery level of 97.77%. Oh, dear. That’s awful. Well, that’s the point that I want to make as well. Yes, you have GPS data in photos and that’s sort of scary, but you can be tracked in so many different ways that people are unaware of. If people aren’t even aware of that little thing, then how can they ever become aware of all these other things that are happening with their personal data online?

Adam: Right. And particularly since that’s a pretty obvious example — our phones will show us our GPS coordinates; there’s usually an app that will say, okay, these are all the photos you took here, here, and here. So that capability is even being flaunted to us. But certainly many of these other things are not being shared with us. It’s just sort of being done, and it’s assumed that we’re okay with it — we’ve clicked accept on the terms of service and it’s okay to share all of our information, and that’s the cost that we’re often paying for the use of free services and applications.

James: I think because it’s so complicated, and because it’s so difficult to predict the future, perhaps one of the best things we can do is what we’re doing now — or what you have done, Per — lifting things to the surface. We’re talking about it. The open dialogue.

Adam: Yeah. I fully agree — the open dialogue, and adding that friction when possible, I think is a great way of doing that: building it into our sites and applications and hoping to raise awareness among our users that these are the decisions they’re making with their data when they use our site.

Per: Which will hopefully give trust to the right people. Thank you so much for joining us, Adam. This was excellent.

Adam: Thank you so much for having me.

[Music]

James: So we just can’t get away from the fact that too many things are unknown — we’re oblivious to them — even if we’re as ethical and as thoughtful as we can be in so many situations. We don’t know the future. We don’t know how things are going to be — I don’t know — bastardised or combined in amazing new ways when we sit down and design.

Per: Sure. Yeah. And — exactly. There’s no way of knowing. It’s impossible.

James: Not everything.

Per: Yeah. But there’s also — I mean, nobody is even held accountable, so there’s no real incentive to do anything, because nobody is going to say, “no, you shouldn’t have added that Exif data to that file, and we’re going to take it to court”. That’s not going to happen, and nobody is even going to say that we made a bad decision.

James: No. So effectively it’s a bit like keeping healthy, keeping fit. We’ve got to be constantly talking about the things that we haven’t designed for, constantly considering them, to keep our minds fit about the topic and keep us aware of the fact that other stuff happens — not just the stuff we specifically design for.

Per: Yeah.

James: So how about the suggestion that you periodically have a little get-together with your team or your stakeholders or organization or whatever — or even users, or people taking advantage of your service or buying your product — and get people to throw their minds into the future and ask: how could this be used? What could you do with this in 10 years’ time? In 25 years’ time? 50 years’ time?

Per: And also — I mean, that’s interesting — you could think about: if I wanted to use this for evil, how could I?

James: Yeah. So you could kind of role-play this. It doesn’t have to be connected to what you’re developing now; it’s more like a keep-fit exercise for your brain and your awareness and imagination about what you’re doing, what you’re creating. Some things might come up that make you go, “oh, yeah, you’re right — that could actually be something that happens. Maybe we should just tweak this here now.” Because there are also so many simple decisions you make here and now — design decisions or even development decisions — which perhaps make something easier or harder, and that would be a good thing ethically, or privacy-wise, or even just for the health of the product. So have that kind of moment where you can say: look, right, we’re going to be futuristic now. Here’s our world. What could it be doing in 25 years’ time?

Per: How would you get people to set aside time to do that, though? Now, just you and me thinking about it, it actually sounds quite fun, because it’s like storytelling — and that’s the type of storytelling you could also use when reaching out to people and creating interest in your product. So I can see there’s a value in it, but you need to help people realize that value.

James: Yeah. But I mean, you don’t have to make it into a full-day workshop. I reckon you could do this in an hour. You can just sit down — you could even do it over lunch.

Per: Yeah. But how often?

James: But that’s a whole different thing. That depends on your pace of change.

Per: Hmm.

James: I mean if you — if you don’t.

Per: The pace of change of the world around you.

James: Exactly, both aspects. How much you were changing things and how much out there is changing.

Per: Yeah. So if more companies had sat down when the iPhone came out and said, “look at this”, they probably could have seen some interesting things happening.

James: Yeah. Well, you didn’t necessarily have to look at the iPhone. Once you had that seed in your head about a world with Star Trek communicators in your hand — which became reality within 10 years or so — you could sit down and play with the ideas. How would the world be from our product’s viewpoint, or our service’s viewpoint? What is it going to be like?

Per: And to be fair, a lot of this is happening, but often specifically around the world of self-driving cars. People are thinking about the ethical aspects of that, but perhaps not of the everyday website.

James: Yeah. And I think with e-commerce and warehouses and delivery drones there’s an aspect of it there as well. But I think you could definitely make a fun little meeting or something out of this — or at least use it to help us talk more and spread awareness about being aware, which is kind of back to the beginning of the Dick Pic Locator. It wasn’t really built to reveal where somebody had taken that picture. It was there to raise awareness of the fact that this data is being spread around, and that we made decisions — other people have made decisions — that have led to that.

Per: But also, I think, actually to fight the phenomenon of Dick Pics by having the perpetrators realize that the data is there — because even if the receiver of the photo doesn’t have the data, somebody else does.

James: Yeah. We’re oblivious to everything. But a little less oblivious now.

Per: Yeah.

James: So show notes and links from this episode are available at Uxpodcast.com. If you’re not already a subscriber then please just add us wherever you’re listening to us now. If you are a subscriber, tell a friend or a colleague about the show. Thank you for taking the time to listen.

Per: Remember to keep moving.

James: See you on the other side.

[Music]

James: Knock, knock.

Per: Who’s there?

James: Atch.

Per: Atch, who?

James: Bless you.


This is a transcript of a conversation between James Royal-Lawson, Per Axbom and Adam Scott recorded for UX Podcast in August 2017. To hear more episodes visit uxpodcast.com or search for UX Podcast in your favourite podcast client.