Watch Yonder CEO Jonathon Morgan on Machine Meets World

Yonder CEO Jonathon Morgan on Truth and “The Algo”

Join Machine Meets World, Infinia ML’s ongoing conversation about AI

James Kotecki
Sep 1 · 24 min read

Episode Highlights

This week’s guest is Yonder CEO Jonathon Morgan.

“You’ll see it in the comments of a TikTok video like, ‘Oh yeah, I’m liking this because I want the algo . . . I want the algorithm to surface more content like this.’ They’re actively engaging in a dialogue with the systems that curate content on their behalf.”

“I think ultimately that comes down to humans. This always sounds so nihilist and dystopian. But I think there’s a small subset of things that are objectively true and objectively false — they’re just the facts. But so often, I think a partisan political, or any type of partisan interpretation of those facts is likely to be so different that the facts themselves are obscured.”

“…what most algorithms have inadvertently discovered is that people respond to certain types of content better than others. And it turns out that as human beings, we respond better to content that reinforces our point of view.”

Watch the show above. You can also hear Machine Meets World as a podcast, join the email list, and contact the show.

Audio + Transcript

James Kotecki:
Live from Infinia ML, this is Machine Meets World. I am James Kotecki, joined by my very special guest, Yonder CEO and founder, Jonathon Morgan. Let’s talk artificial intelligence. Hi Jonathon.

Jonathon Morgan:
Hey James. Thanks for having me.

James Kotecki:
Well, thanks for being here. So I go to the Yonder website, and in huge bold letters, it says, the internet is alive. What does that mean?

Jonathon Morgan:
So a premise of the company is that there’s… I think we think of the internet like it’s just a… it’s a big crowd. Like, it’s a bunch of people, every account is one person, and we’re all making like little decisions, taking small actions, contributing in a small way our voice to like a larger conversation, and then kind of the wisdom of the crowd sort of figures out what’s best and then surfaces that to everybody else, and that’s kind of how the internet works. From our point of view, it’s actually much more important to focus on groups like the… Especially in the modern social internet, everybody’s been herded into kind of like-minded groups.

Jonathon Morgan:
In fact, the groups that we participate in are almost like organisms. Like, they have their own subcultures, their own kind of behaviors, their own idiosyncrasies, their own passions and their own point of view, and the people kind of participating in these small communities and mobilizing in these small communities is really what makes the internet like an organism and it’s how ideas move from one group to another and how we basically like shape our point of view online. So we think about it much more like that. Like, it’s much more dynamic and much more organic than I think originally people… like we used to think about it.

James Kotecki:
So is it related to the concept of, you’re looking at ant colonies on a plane, rather than the individual ants?

Jonathon Morgan:
Yeah. I think that’s a great way to think about it.

James Kotecki:
Is this related to the concept that I often hear in certain corners of the internet, where someone says something like, “The internet is talking about this crazy new video”? That’s what a newscaster would say. It always seems like kind of a ham-fisted way of describing it. Because what I perceive as the internet is based on my own sociopolitical, cultural decisions about who I’m following and what that is. My experience of the internet is totally different than someone else’s experience of the internet. So is it related to the concept of the internet as, like, a phrase people just throw out there?

Jonathon Morgan:
Yeah. I think that’s exactly right. I think what you just said is a very sophisticated understanding of the internet. What’s fascinating, by the way, is that like people about 20 years younger than us, who are like kind of Gen Z, if they’re using the different social platforms, they recognize this explicitly. Like, they’ll say things like… You’ll see it in the comments of a TikTok video like, “Oh yeah, I’m liking this because I want the algo… I want the algorithm to surface more content like this.” Like, they’re actively engaging in a dialogue with the systems that curate content on their behalf. I think that dynamic is what creates things like, whatever you want to call them, filter bubbles, et cetera. We all have this kind of semi-curated, either implicitly or explicitly curated, subjective experience of the internet, and it is different for every single user, to a certain extent. So I think that’s a great way to think about it.

James Kotecki:
So is that what Gen Z is calling algorithms now, the algo? Do I need to get onto that language [crosstalk 00:03:17] Gen Z?

Jonathon Morgan:
Yeah, it’s like they’re friends. They understand how to interface with the machine, which is both fascinating, and then… I can’t decide if it’s a dystopia or if it’s actually like a great step forward.

James Kotecki:
But it’s certainly an acceptance of the reality. So let’s get into the algorithm component of this, and specifically… So the internet is alive, it’s ant colonies, and we’re studying it at that level, what is your company specifically doing to play in that space?

Jonathon Morgan:
So we want to make sure that we understand the ant colonies and their dynamics and their shape and their influence, because for us, those are the building blocks. Those are kind of the foundational components by which ideas spread online. So if you care about how ideas spread online, then you should care about these ant colonies. We have a name for them: we call them factions. If you care about how ideas spread online, you have to care about factions, and all of our technology is designed kind of around that basic premise: can we find factions, can we kind of measure [inaudible 00:04:16] influence, can we understand their behaviors, can we understand the impact that they have on social conversations? Kind of, what’s their point of view, what are their passions, et cetera, et cetera.

James Kotecki:
So give me an example of a faction that isn’t obvious. I assume political parties might be factions, maybe subgroups within political parties, but just give me an example of a faction.

Jonathon Morgan:
Oh, sure. It can really be everything. We’ve seen examples of people who stan a particular type of music. So like, K-pop stans, for example, are a very highly mobilized group online. Not just-

James Kotecki:
For those older than Generation Y or the Millennial generation, we should say that stan means they like it, they’re fans of that music.

Jonathon Morgan:
That’s right, they’re like super fans. It’s like-

James Kotecki:
Okay.

Jonathon Morgan:
I think it relates to an Eminem song. I’m actually a little bit too old to know why we say the word Stan, but I accept that it’s a thing. I’m on the rollercoaster with everybody else.

James Kotecki:
Okay, we’ll go with it.

Jonathon Morgan:
We’ll go with it. K-pop stans are a faction that we track. The reason we call them that, or the reason we label them as a faction, is that they, A, have a similar interest and are kind of hyper-passionate about a specific thing, and then, B, they mobilize: they intentionally coordinate their activity to pursue an agenda online, for better or worse.

Jonathon Morgan:
So one of the ways that we’ve seen K-pop stans have this kind of big impact on national conversations is by a technique called hashtag hijacking. This is something that I used to study as a researcher about extremism back when groups like ISIS were engaging in kind of like black hat social media techniques to overwhelm a social media conversation by taking a hashtag that was super popular, like about a world cup game or something, and then posting on that hashtag, in high volume, content that had nothing to do with the hashtag itself. So people trying to follow a particular thread of social conversation online were getting this content injected into their feed, and that’s exactly what K-pop stans are doing.

Jonathon Morgan:
But instead of extremism and radicalization, they’re pursuing kind of like a social agenda. Like, they have a point of view about how the world should be, and so it’s usually kind of in direct opposition to like a White nationalist group, or something that they feel like they’re the good guys and the other team is the bad guys, and they’re kind of playing this game online in order to shape the conversation, have influence and shape people’s point of view, ultimately, engage in activism.

James Kotecki:
So if you were out there trying to understand what kind of stans are going to have an impact… what kind of fandoms or what kind of factions are going to have an impact, you might not have thought about that connection in advance, right? You wouldn’t have thought, “Okay, there’s going to be some White nationalists out there, and they’re going to be disrupted by the K-pop stans.” You wouldn’t have put those things together. I guess, maybe it seems obvious in retrospect, but a lot of people wouldn’t have understood that. Is your company in the business of trying to predict or understand these factions better, to know in advance or understand in advance, how they could behave in maybe counterintuitive ways?

Jonathon Morgan:
That’s exactly right. Because ultimately… I think this is most important for people who are, as part of their job, active participants in the public discourse. So this is everybody from PR and communications for big companies, to kind of policymakers, and anybody who needs to understand public opinion and has some interest in shaping it, in being part of the discourse. It’s always better to have some advance knowledge of, where is public opinion going? Who’s shaping public opinion? Who are the tastemakers? Who’s kind of in support? Who shares my values and is advocating for the same things that I would advocate for as, like, a purpose-driven brand, for example? Or, who’s adversarial to my point of view, for one reason or another, because of how they perceive the brand’s position on social issues that are important to them?

Jonathon Morgan:
I think it’s interesting that you brought up political factions earlier, because, of course, that’s a thing. Like, that’s definitely something that politicians have to do: shape public opinion and get people on their side. But increasingly, everybody has this responsibility. Brands do, in particular, because they all want to be purpose-driven, they all want to be brands with values, they all want to connect authentically with consumers, and so they get caught up in a lot of these very heated conversations online that often bring out these factions that are explicitly coordinating their activity to pursue agendas. Brands get kind of caught in the middle of it, and they have to be able to predict what’s likely to happen as they wade into important social conversations.

James Kotecki:
This is an aside, but if I want to be an influential person on the internet, do I need to think less about my personal brand and more about the faction that I’m leading? Because you might think about the old-school model of like, I’m going to create a YouTube channel and become personally famous, and everyone’s going to love me. Maybe I’m more effective if I can be a lieutenant in the K-pop movement.

Jonathon Morgan:
Well, I’d actually argue that that original idea, that you’d just be a content creator online and that people would naturally gravitate towards your content and follow you, has actually never been the way that people have built authentic communities online. My history with this goes all the way back. This is embarrassing, but like pre-YouTube and pre-Twitter, when the main way that people were organizing socially online was through blogs, kind of old-school, personal-diary-style blogs.

Jonathon Morgan:
What was interesting is that the people who had a really large following, really large readership were people who were actively cultivating and engaging a community. Like, they were finding very engaged people who really resonated with the material, and they were building authentic relationships with them. And then they would have people who would advocate on their behalf online. They were kind of hyper-engaged and hyper-interested.

Jonathon Morgan:
I think that anybody who’s built an audience from scratch, whether it’s on YouTube or even other platforms like TikTok or Vine, when that was a thing, or Instagram, they’re people who really can mobilize a group of people to take some action, whether it’s participate in an online campaign or go buy something on behalf of a sponsor. Whatever kind of influence and value they have online, it’s because they’ve actively built a community. So they’ve, in fact, formed a faction around themselves, and that’s why they’re influential. And then some kind of transcend that, they have platforms, like kind of mainstream media figures. But at the beginning, most people are basically cultivating a faction, and that’s how they have that online success.

James Kotecki:
So let’s get back into how AI actually plays into the way that you’re able to understand this. Obviously, this is all algorithmically driven on the social networking side in terms of what they’re actually surfacing to other users. But a lot of that, I think, to many people, might seem like a bit of a black box, I’m going to do the best I can to try and manipulate the algorithm in my favor, but they’re so complex and they’re kind of always changing, and they’re really hard to understand. I assume that part of your company’s mission is to try to make that a little bit more transparent for your clients.

Jonathon Morgan:
Yeah, that’s exactly right. I agree, I think that the algorithms that social media companies use to try and surface content that’s both personalized and likely to be interesting or engaging for any given user are incredibly sophisticated and, I should point out, well-intentioned. I think the thing driving most social media platforms to curate this content is that they’re creating what they believe to be a better user experience, and these dynamics are just byproducts of that at scale.

Jonathon Morgan:
But I think the thing that underpins all of the techniques they use is a fundamental belief in “the wisdom of the crowd”: if people like me like stuff like this, then I’ll probably like it too. I think that’s the thing that ultimately creates this kind of curated, subjective social media experience. So for us, I think it’s important to identify not necessarily the mechanics of distribution online; I think those are kind of baked in at the end of the day. For us, it’s important to say, for a given topic of conversation, for a given idea or set of ideas, which groups have organized around them and are therefore influential in setting the terms of the public conversation?

Jonathon Morgan:
I don’t know if you’ve ever heard the expression that like, he who is offended first on Twitter wins. There’s kind of that idea for like most of the conversations that happen online, like the group that’s most passionate about it is the one that’s most engaged, the one that posts about it the most often, the one that kind of drives the most volume. And then because of their passion and their mobilization and their coordination, they ultimately kind of set the terms for the conversation publicly, and that’s how they have a lot of influence.

James Kotecki:
You notably said, he who is first offended, not he who is first delighted or excited about it. Right? There’s a-

Jonathon Morgan:
Yeah.

James Kotecki:
… negativity, certainly, that is easier to tap into a lot of times than positivity. I’m not even sure how to ask this question, other than to just throw up my hands and say: the state of the world is crazy, and I have a feeling social media is part of it, because of everything that’s happened from 2016, and before that, obviously, but certainly made visible in 2016 and on through today. In politics and so many other domains, social media feels a bit like a toxic wasteland at times. What do you say to that?

Jonathon Morgan:
I think in a lot of ways, it does. Also, I don’t know how else to get at it. I should point out that I have a real fondness for online culture and social media, for no other reason than that… I think for a lot of people, at least of a certain age… Like, I was around in the kind of naive early days of the social internet, and a lot of my community was formed online. A lot of who I was, how I saw myself as a person, was shaped by the really kind of warm, positive, human-to-human authentic interactions that I had on the social internet.

Jonathon Morgan:
I think we’ve seen it be very powerful in a number of ways that are easy to forget. Because maybe another byproduct of social media is that our memories are all very short. But if we look back even 10 years, I think there have been multiple times when a collective voice or a movement was able to be coordinated at a much larger scale due to the ease of communication and the ease of discovery on social media, and people were able to effect change as a collective much more easily than they would have been able to prior to social media.

Jonathon Morgan:
So I think there are a lot of positives, but I also think that because of the kind of polarized dynamics, one of the byproducts of creating this subjective, hyper-personal experience is that what most algorithms have inadvertently discovered is that people respond to certain types of content better than others, and it turns out that as human beings, we respond better to content that reinforces our point of view. So the content that reinforces your point of view, over and over and over again, and is more and more engaging, tends to present to you a more extreme point of view than the one that you had before, because the idea is to kind of escalate your engagement in a way that is pleasing or reinforcing.

Jonathon Morgan:
With a black box that doesn’t think about the subjective quality of that content, or the subjective quality of the kind of emotional response that it’s provoking, you can end up with dynamics that feel, frankly, pretty similar to the dynamics of radicalization: closed information spaces, kind of shifting [inaudible 00:15:05] windows, things that in a lot of ways are benign. Like, I don’t think anybody’s worried about people being extreme K-pop stans, but in a lot of ways, I think it’s been very disruptive to our public discourse, especially in how we discuss what can be divisive, but also really essential, social issues about the way that we organize ourselves and the way that we govern ourselves and the way that we relate to each other as citizens. So I think it can make for a very toxic environment. I think that’s true, and people are really calling into question how valuable this is to our public discourse.

James Kotecki:
Speaking of public discourse, and I think you mentioned the idea of how we govern ourselves: right now, it’s largely self-governed by the social networks themselves, as far as how they create these algorithms. Even though they have tremendous influence, there’s not necessarily an overarching regulatory framework governing that. Do the companies themselves understand this to a level that they’re trying to make changes to kind of benefit us? Obviously, there’s tremendous pull for them to just optimize for monetization, for more people spending more time on their site, clicking on more ads. But at a certain level, when they get big enough and they’re controlling, to a certain extent, how certain segments of society feel… There must be people inside these companies who are trying to raise kind of red flags about ethical issues here.

Jonathon Morgan:
A hundred percent, there are, and I think there always have been. Speaking of kind of simplistic, sometimes toxic discourse: I think the way that people have discussed the roles and responsibilities of the social media platforms has been fairly binary. Either it’s big tech, and they’re kind of evil corporations who have designs on society, or they just don’t care as long as they can continue pursuing profits. But I think it’s a much more nuanced view. Like, it’s a much more complex situation.

Jonathon Morgan:
So I think there have always been people inside the social media companies who have recognized the potential for influencing public opinion at scale and the responsibilities that come with that. I think now, what I see is that there are kind of very large policymaking groups inside the social media platforms that are grappling with really, really difficult issues. Like, what’s our responsibility to create kind of a safe online environment? What does that even mean, to be a safe online environment? What are the trade-offs between giving people access to congregate, access to share their point of view? But at the same time, what if that point of view is kind of fringe or extreme, or… Do we want to provide a platform for ideas that ultimately, as a society, we don’t agree with? I think that-

James Kotecki:
And then who is judging that, right? Like, there’s-

Jonathon Morgan:
Exactly.

James Kotecki:
… all these things that almost nobody in society would want, but then as soon as you… There was a good episode, I think, of Radiolab about this. Like, as soon as they set the rules for what they would and wouldn’t do, there came a gray area where they had to make another set of rules.

Jonathon Morgan:
That’s exactly right. I think that it’s in the gray areas where a lot of debate is happening, and a lot of debate is frankly very healthy. I think the kind of overarching question that is a question that I think we have to grapple with as a society is, do we want these decisions being made by private companies that I think are… even if we assume they’re well intentioned, they’re unelected, they aren’t accountable to the public, they’re accountable to consumers, maybe. But it’s not the same dynamic that we have in all of the other places where we make policy, all of the other ways in which we kind of decide how to govern ourselves through lawmakers.

Jonathon Morgan:
So I think that’s a… It’s kind of emerged that the de facto public square where free speech occurs is now governed entirely, to your point, by private companies who can do whatever they want. So is that the dynamic that we want? Is that the most healthy way for us to manage this as a society? I think that’s a good debate to have.

James Kotecki:
Well, if you could put odds on it, what would you predict is the chance that we have some kind of sweeping legislative regulatory body in the next decade or so for something like this?

Jonathon Morgan:
I think ultimately, it’ll depend on politics. I think it’s more important to some politicians than others. I think a political party that feels that private companies should be left to their own devices, which I think is a reasonable way to organize, might say, “Hey, look, the market will decide, consumers will decide what type of experience they want. We don’t want to establish the policies of private companies.” I think a political party, or a set of politicians, who feel that the government has a larger role to play in regulating the behavior of private companies and how they interface with society is more likely to make this a priority and engage in some type of legislation.

James Kotecki:
It’s interesting because it’s kind of scrambled though, right? On the Republican side, you have maybe people who feel offended that these social media networks maybe have an anti-conservative bias in their minds, and so those who would be more in favor of just laws that [inaudible 00:20:12] let these companies do whatever they want may somehow also feel a pull towards regulating these companies more because they feel like they’re biased against their political views.

Jonathon Morgan:
It’s interesting that you bring it up. There was an interesting moment. After the 2016 elections, the kind of research and advocacy community, folks who had a background in understanding these types of online dynamics, were really looking at a piece of legislation called CDA 230. The gist of that legislation is that the platforms aren’t responsible for the content that their users publish, assuming that they’re acting in good faith [inaudible 00:20:44] blah, blah, blah. So that was the argument that those groups were using to get some leverage over the social media platforms, saying, “Hey, you weren’t acting in the spirit of CDA 230. We need to revisit this in order to legislate some of the ways that you govern the content on your platforms. Maybe you should be responsible for it.” What was interesting is that recently… This is so inside baseball, I feel like we’re really nerding out here.

James Kotecki:
Let’s get into the [crosstalk 00:21:07].

Jonathon Morgan:
Recently, there was an executive order from the Trump administration that also invoked CDA 230, because I think they were concerned about some of the things that you just noted: they’re worried that perhaps there’s an anti-conservative bias, from their point of view, and so if that’s true, they want their own leverage over the social media platforms. So historically, kind of liberals took that on, used this obscure piece of legislation after the 2016 elections, when they felt like the discourse on social media didn’t go their way. And now that the Trump administration is concerned that discourse on social media platforms might not go their way, now they’re leveraging CDA 230.

Jonathon Morgan:
I think it’s a good point. Inevitably, because it’s politics, there’s probably going to be some political opportunism. But what I hope comes out of it is that we re-examine laws that were written for a completely different internet. At the end of the day, the internet of [inaudible 00:22:03] 2000 bears very little resemblance to the internet that we have now. So re-examining that legislation and saying, what do we want this public square to be? How do we want public discourse to operate? What is in the best interest of the users of the internet? How do we want to codify that? What rules of the road should there be? I think this is a good opportunity to do it, hopefully in a way that people can set politics aside and talk about sort of what’s best for the users of the internet.

James Kotecki:
I know that misinformation, disinformation is another thing that you’re tracking and trying to understand as it moves through the internet. Obviously, a lot of concern about that in the 2016 election, the 2018 election, there was concern about it, and now we’re going into an election season with maybe even more heightened sensitivity to the possibility of bots and other kinds of false information and manipulated information spreading online. What does your research tell you in general about what’s happening, heading into 2020?

Jonathon Morgan:
Well, I think it’s really important, as a framing, that outright disinformation is a very, very small part of the online discourse. It, of course, happens, and it’s important to call it out when it does and recognize it, but I think what most people… The kind of common understanding of what it means is actually just the manifestation of activism online. That most of the information, most of the content, most of the behavior, a lot of the information that people would label as so-called misinformation and disinformation, I think is rightly allowed on the social media platforms, and that the disconnect is that the way in which people advocate for an agenda online is inherently misleading and inherently confusing to the rest of social media users. So I think that disconnect is where the real problem is, that misunderstanding, that lack of transparency.

Jonathon Morgan:
So, a group of social media users all get together in a Facebook group and they say, “Hey, we want to advocate for our point of view on this social issue. We’re all going to use this hashtag, we’re all going to post at this time, we’re going to get up in the mentions of these three kind of middling social media influencers, and we’re going to run a campaign, because we care about this.” That coordinated activity to advocate for a point of view shares all of the same dynamics as an outright disinformation campaign. But it’s real people engaging in a sort of collective action, so it’s almost like a protest. It really blurs the lines.

Jonathon Morgan:
So I think what we’re likely to see is very much an increased amount of that type of activism, that type of engagement. I think the social landscape is going to be littered with groups pursuing agendas, and I think what’s important is that we’re able to identify when a group is pursuing an agenda, and separate that from everybody’s mental model for the internet, which is more like, well, this is just people talking about something. So if I see the same social media post 20 times, I assume that it must be kind of a big deal. Like, a lot of people must be posting about it for me to see it 20 times. But it very well could be 20 people trying to post about it as often as they possibly can, so that it increases the likelihood that it shows up in as many feeds as possible and creates the impression of kind of a groundswell of public support when it’s actually something more like astroturfing.

Jonathon Morgan:
I think that, again, this is just the way that we’ve decided to govern our public square, this is how the internet works. The modern social internet functions this way on purpose. So now we have to deal with the consequences and say, how can we identify the difference between coordinated activity and individual activity, and just label it for what it is and be transparent about it? And then, in extreme cases, be able to identify when there’s a group that is covertly trying to manipulate the public through a coordinated disinformation campaign. But that’s the type of thing that I think is associated with governments, it’s associated with kind of explicit geopolitical influence that’s being conducted covertly, and it’s actually kind of a rare case. I think the thing that we should all be concerned about is the more common case, which is this coordinated, agenda-driven influence.

James Kotecki:
I’m not sure quite how to frame this question, but does truth play a role here? Like, can algorithms, can AI… can that help to understand when these groups are telling the truth or adhering to the facts, or not? Because on some level, those algorithms, even the most sophisticated, would have to be trained on data that was, at some level, labeled by people about what truth was. So do we have a technological solution here, with algorithms and AI, or ultimately, does it come down to humans?

Jonathon Morgan:
I think ultimately that comes down to humans. I’ll say this always sounds so nihilist and dystopian. But like, I think, again, there’s a small subset of things that are objectively true and objectively false, like, they’re just the facts. But so often, I think a partisan political, or any type of like partisan interpretation of those facts is likely to be so different that the facts themselves are obscured. This comes down to like, well, are we talking about gun control, or are we talking about gun rights? Are we talking about the right to life, or are we talking about the right to choice?

Jonathon Morgan:
Like, even in like the very basic framing of an issue, there’s kind of an implicit reshuffling of the underlying facts in almost anything that I think people feel very passionately about. So I think, well, again, there’s a small subset of things that are outright false, and I think that fact-checking organizations do their best to inform the social media platforms to remove content that is deliberately misleading. But I think for the most part, again, we should be concerned with kind of, whose agenda is driving this content, what is their partisan interpretation of the facts, what’s their point of view, what are they advocating for? And then as a consumer, I can decide like, “Well, with that additional context, what does this mean to me? What should I do with this information, given that I know what its agenda is?”

Jonathon Morgan:
Again, it’s more complicated, it’s more sophisticated, but I think that there’s no… I don’t think we get anywhere by simplifying it to like, what’s true and what’s false, because at the end of the day, the human experience just doesn’t work like that.

James Kotecki:
So the way to make the internet more authentic is not necessarily to have an algorithm determine what is true, but rather to give people more context so they have a more nuanced understanding of who’s saying what, and to let them make up their own minds.

Jonathon Morgan:
Yeah. I think that’s exactly right.

James Kotecki:
Well, thank you. I appreciate, by the way, a guest who’s willing to go to a quote “nihilist and dystopian” place in an interview. Also, you know the thing about this interview that was funny? I don’t think either of us are that old, but this conversation often felt like two old men talking about the internet. Like, remember back in the day when the internet was this, and now it’s not, and the kids are doing different things and… I guess that’s just where we’re at now, Jonathon. But I appreciate you coming on the show and sharing your insights. Jonathon Morgan, CEO and founder of Yonder, thanks for being on Machine Meets World today.

Jonathon Morgan:
Thanks so much for having me. I really enjoyed the conversation.

James Kotecki:
And thank you so much for watching. Like, share, comment on LinkedIn or wherever you’re getting this show. Thank you so much. I’m James Kotecki. That is what happens when Machine Meets World.


Originally published at https://infiniaml.com on September 1, 2020.

Machine Meets World from Infinia ML

Weekly Interviews with AI Leaders

James Kotecki

Written by

VP of Marketing & Communications for Infinia ML, a machine learning company. Speaker from North Carolina to South Korea.

Machine Meets World from Infinia ML

Infinia ML’s weekly interview show with leaders in artificial intelligence.

