The State of Engagement, Optimization, and Dogs

sean_lukasik:
So Dr. Jen Golbeck, thank you so much for joining the Paesanos Podcast. When I originally had this whole project in mind, you were the first person that came to mind to talk about some of these things with, and I'm really excited to dive in with you. So I appreciate your time and expertise.

jen_golbeck:
I'm thrilled to be here.

sean_lukasik:
Thank you. So I guess let's just jump right into it. First of all, I want to talk about algorithms, and really just establish, from a computer scientist, what the definition of an algorithm is before we talk more about, you know, the ways that it permeates the world that we live in today. But let's start with the definition.

jen_golbeck:
Yeah, you know, from the basics in computer science, an algorithm is really just a set of instructions. You can think of it like a recipe; a recipe is absolutely an algorithm. And if you were an undergraduate computer science student taking your first algorithms class, you'd probably spend weeks learning sorting algorithms: how do you take a list of numbers and put them in order? Which is so boring compared to, like, the sinister algorithms that we're all used to. But really, that's all it is. The way the term is used now is really referring to personalization and optimization algorithms that drive the content we see, whether it's through search engines or social media or music streaming services.
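
A minimal sketch of the kind of sorting algorithm described here, insertion sort in Python; purely a toy illustration of "a set of instructions", not anything specific from the interview:

```python
# A toy "recipe": insertion sort, the classic first-semester
# sorting algorithm. Takes a list of numbers, puts them in order.
def insertion_sort(numbers):
    for i in range(1, len(numbers)):
        current = numbers[i]
        j = i - 1
        # Shift larger elements one slot to the right to make room.
        while j >= 0 and numbers[j] > current:
            numbers[j + 1] = numbers[j]
            j -= 1
        numbers[j + 1] = current
    return numbers

print(insertion_sort([5, 2, 9, 1]))  # -> [1, 2, 5, 9]
```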

sean_lukasik:
Yeah, and that seems to be when it became a term used widely in the culture that we live in today. And I think a lot of people think of algorithms in terms of social media specifically, because whether we're business owners or marketers or whatever, we're constantly trying to work those algorithms, which we never really know exactly what they are. But you're someone who has really studied them, and the goal seems to be different from, you know, site to site. So what is it that the social media algorithms are trying to do for us as users when we visit a site like Facebook or Instagram or Twitter or whatever?

jen_golbeck:
Yeah, you know, there's a lot of ways to think about it, but I think the easiest way to conceptualize this and so many things on the internet, it comes down to money,

sean_lukasik:
Hmm.

jen_golbeck:
right? How does Facebook, or Meta now, how do any of these companies make money? They do it by selling ads. How do they get ads in front of us? I mean, one, they want to pick ads that we'll like, but two, the ads are interspersed with the content that we like. So these algorithms are really optimizing for engagement, for whatever keeps us on the platform the longest. Now, if it works well, that's stuff that we like and want to see anyway. But a lot of times it isn't. A lot of times it's stuff that makes us really angry and makes us feel bad, but we feel like we have to respond to it and

sean_lukasik:
Mm-hmm.

jen_golbeck:
like yell at that idiot on the internet. And from the perspective of the social media companies, that's great: our engagement goes up, there's more opportunities to sell us ads, and that makes more money.
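
A toy sketch of what engagement-optimized ranking can look like; the field names and weights are hypothetical, not any platform's actual model. The point is that an angry reply counts toward the objective just like a friendly one:

```python
# Hypothetical engagement-ranking sketch. Field names and weights
# are invented for illustration; real ranking models are learned,
# not hand-weighted like this.
def engagement_score(post):
    # An angry reply counts toward engagement just like a like does.
    return (1.0 * post["predicted_likes"]
            + 2.0 * post["predicted_comments"]
            + 3.0 * post["predicted_dwell_seconds"])

def rank_feed(posts):
    # Show the posts predicted to keep eyeballs on the screen first.
    return sorted(posts, key=engagement_score, reverse=True)
```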

sean_lukasik:
Yeah, and you wrote an article called Optimizing for Engagement Can Be Harmful, and there's a better way. I always think about that idea of optimizing for engagement getting a little out of control when people are just buried in their phones. I don't know if that's what the intention was, but the algorithms are doing a good job at keeping people engaged. Why do you think that can be harmful? Let's start with the negative and we'll move to the positive.

jen_golbeck:
Sure. So TikTok is a great example of somewhere you can really get lost. There have been times of high anxiety where I'm like, I just need the next three hours to pass so I can get through this thing. And great, I put on TikTok and the three hours disappear, because my brain shuts off and just absorbs it.

sean_lukasik:
Yeah.

jen_golbeck:
So there's certainly a debate on, like, is that a good thing? But I think there are much more concrete examples of the harm. And one of, I think, the most damning stories comes from Facebook itself, asking: why do we have so much far right, alt right, kind of extremist, fascist content on our platform, and why are people so engaged with it? And their internal group studied this and found that, I think, like 68% of people who joined a far right group on Facebook did so because Facebook recommended that group to them. So whatever, I'm on a kind of standard conservative page, and Facebook's like, hey, you might also like this extreme conservative, sort of fascist page. People get in there, and there are all sorts of harms that come from that, because it can normalize more extreme thoughts, and that happens across the political spectrum. It can just make you feel bad, and there are an awful lot of studies out there that show lots of content makes us feel bad, that it creates eating disorders in adolescent girls. I'm running a study now looking at, you know, what I'm colloquially referring to as triggering content.

sean_lukasik:
Mm-hmm.

jen_golbeck:
I have really bad health anxiety. You would have called me a hypochondriac back when that was the term in use. And when I get ads for cancer medication on social media, I don't have cancer, but I think about it when I see those ads. And if I'm in a bad place, that can send me down a rabbit hole. We see that with people who are recovering alcoholics or recovering from eating disorders: it can be very engaging to put that sort of content in front of people, even when it creates dangerous health situations for them. So there's a lot of ways where engagement, keeping our eyeballs on it, works, but it makes us feel worse.

sean_lukasik:
And what's the solution then? So the subtitle to that article is there are alternatives. Let's just start with the social media companies. What is their responsibility to find alternatives and what can they be doing knowing that they're essentially optimized to make money from engagement?

jen_golbeck:
Yeah, this is one of those things where like there is a solution to the problem, but it costs money. And

sean_lukasik:
Mm-hmm

jen_golbeck:
basically, companies will make less money if they follow those solutions. So I did a study where I was looking at, can we optimize for well-being, your sense of just feeling safe and comfortable and good and happy? We'd show people content and we'd measure how they felt afterwards. And we were basically able to take the same kinds of algorithms that are in use and just switch out, instead of optimizing for stuff that gets people to click or stay longer or comment, let's swap in stuff that we know makes each individual person feel better. And that worked. We could build a feed where you would come to, say, Twitter, and if you had the For You section of Twitter, what it would show you is stuff that made you feel really good. Now, that's not everything you want; I think all of us know that we need to see some bad stuff sometimes to, whatever, be informed about the news and that kind of thing. But it costs money. If they're showing us things that aren't getting us doing those angry comments and replies and engagement, we're going to see fewer ads. That said, I think there's a model for this. I think there's a way that a company can be profitable while not maximizing their profit, trading some of that for social good. And we've seen some places where social media companies have made those choices. Most major platforms now have decent rules about stuff that's just not allowed, right?
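
Following the same toy model as above, swapping the objective might look like this; predicted_wellbeing is a hypothetical per-user score standing in for the kind of measurement Golbeck describes, not her study's actual implementation:

```python
# Same machinery, different objective: rank by a (hypothetical)
# per-user well-being prediction instead of engagement.
def wellbeing_score(post, user_id):
    # predicted_wellbeing: assumed map of user id -> how much better
    # (or worse) this post tends to make that person feel.
    return post["predicted_wellbeing"].get(user_id, 0.0)

def rank_feed_for_wellbeing(posts, user_id):
    return sorted(posts, key=lambda p: wellbeing_score(p, user_id),
                  reverse=True)
```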

sean_lukasik:
Mm-hmm.

jen_golbeck:
I mean, child pornography would be a classic case, but even stuff that's not illegal, they could support it and allow it, and they choose not to, because they know people don't want to be on platforms where they're inundated with negative stuff all the time.

sean_lukasik:
Mm-hmm

jen_golbeck:
I guess, but there is a technical solution that can show people things that are better for them.

sean_lukasik:
Yeah.

jen_golbeck:
And I think there's a lot of thinking left to be done on how do we balance all those factors.

sean_lukasik:
It seems like in the United States, the pivotal moment that we're moving towards is more about privacy, maybe, than engagement. And that decision about privacy, especially on social media sites, about who controls the data, is one that has been made already in parts of Europe and other parts of the world. So far, the data that we provide belongs to the companies that collect it. And it seems like if and when that decision is made, we might get clearer about the future of social media. Do you think we're headed in the right direction in terms of the privacy conversation?

jen_golbeck:
Yeah, we're slowly crawling our way down that

sean_lukasik:
Yeah.

jen_golbeck:
path. It should be happening much faster. But I think the thing that's ultimately going to push us there, not as fast as we should, but I think we will get there, is that lots of states are passing their own privacy regulations. California has one that's not the same as GDPR in Europe, but it's got flavors of it. It's much stronger than the federal protections we have. And a lot of states are following suit, so you end up with a bunch of state laws that conflict with one another. And

sean_lukasik:
Mm-hmm.

jen_golbeck:
that is absolutely the thing that pushes you to then have an overarching federal law to resolve those problems. So, you know, I think we're moving towards something better. That said, like it's such a fundamental shift in the law around data in the US to say that data is owned by the people that it's about, as opposed to the companies that hold it.

sean_lukasik:
Hmm

jen_golbeck:
Europe has GDPR, but that's because the law in Europe has been, for a very long time, that people own the data about themselves. GDPR just kind of codified that for the continent. That's not the foundation of the law in the US, so that makes the problem harder. I hope we get that shift. I think if we had that fundamental shift, it would fix a lot of things. I think there's also a lot of piecemeal solutions kind of in the works. I have seen a proposal for a law that would essentially ban these personalization algorithms we've been talking about for kids under a certain age, maybe under 13 or whatever, right? They

sean_lukasik:
Yeah.

jen_golbeck:
are really manipulating those young minds at this point. They manipulate us adults, who know exactly what they're doing and can't resist it anyway. It's really potentially very damaging to young kids. And I think if we start seeing some of those things get in there, we start kind of cracking what has been a really impenetrable wall between people and the companies, and getting a little bit of control back. So I'm cautiously hopeful, but unfortunately, I think it's going to take a while before we have something more reasonable in place.

sean_lukasik:
Yeah, and it's scary to think about what those companies can in fact do with the data that we're talking about. The classic example that you've used in talks and on social media is, you know, the Target example, where algorithms optimized for a company's bottom line end up feeling really gross to the customers, or feeling really intrusive. Are there other examples, or maybe talk about the Target example, where those algorithms designed for the company's bottom line are really, like, invasive?

jen_golbeck:
Yeah, I mean, the Target example you're talking about, it's about a ten-year-old story where Target could essentially analyze people's purchase histories to find out if a woman is pregnant. They can figure out what her due date is based on things that she's bought that are not obvious indicators of pregnancy. And the story broke when they found out this 15-year-old girl was pregnant before she had told her parents, because she got a flyer with coupons for baby products.

sean_lukasik:
Mm-hmm.

jen_golbeck:
Since then, there have been volumes of work in this space. We can find out basically any demographic trait: race, religion, gender, sexual orientation, behavioral things like drinking, smoking, drug use. But we can also predict future things. So we've done a study in my lab predicting whether people will stay sober for 90 days when they go into Alcoholics Anonymous, and we can do that on the day they go into AA by analyzing their Twitter feed. Georgia Tech did a really interesting study where they can predict if a woman will develop postpartum depression, kind of on the day she gives birth, by analyzing her social media. Cornell did a study where they can find out if you're at risk for breaking up with your partner, basically

sean_lukasik:
Mm-hmm

jen_golbeck:
based on, like, the structure of your social network. There are just all of these clues about our personal traits and also our future behavior that are pretty easy to model with these algorithms. And you know, for some of that you go, well, I don't really care if they know my race, religion, gender, and sexual orientation; that's all pretty easy to find. On the other hand, look at some of the applications. I just tweeted a story yesterday about how the Catholic Church was using this company in Colorado to track gay priests, looking for them showing up on apps like Grindr. And it's like, on one hand, the data's out there. On the other hand, the fact that you've taken the time to aggregate all of that and are using it in ways that you can clearly see are harmful for that group,

sean_lukasik:
Mm-hmm

jen_golbeck:
that's really upsetting. And there are a lot of companies who specifically set out to say, we're gonna profit on this: in this case the Catholic Church wants to track gay priests, but also, we wanna track people at the border, we wanna screen immigrants, and they make them hand over all of their social media logins and run these algorithms. We're using these kinds of algorithms for estimating people's recidivism risk and using that to decide if they're given bail or probation, and we know those algorithms are racist and biased. So there's just this massive space of problems. And I think these social media companies get away with doing a lot of this because people aren't really aware of the power of those algorithms. You can go, like, personal traits, who cares if they know that? But when you see the vast universe of things they can do and the harms that can be caused, then I think people would stand up and get angry about it.
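
A hedged sketch of the general recipe behind studies like the ones described: represent a user's posts as text features and train a standard classifier against a labeled outcome. The data fields are placeholders, not any real study's pipeline:

```python
# General shape of a social-media outcome predictor: posts in,
# binary outcome out. Placeholder data; scikit-learn assumed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_outcome_model(user_texts, outcomes):
    """user_texts: one concatenated string of posts per user.
    outcomes: one binary label per user (e.g., 1 = stayed sober)."""
    model = make_pipeline(TfidfVectorizer(min_df=2),
                          LogisticRegression(max_iter=1000))
    model.fit(user_texts, outcomes)
    return model  # model.predict([...]) yields a label per user
```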

sean_lukasik:
Yeah. And you've done a great job at that. You know, it's scary, but it's fascinating, and it's very helpful to have the information that you often talk about, just as a check on my own personal use of social media, and for my friends and for, you know, the young people in my life. We'll get back to the darker stuff at some point, but let's balance it a little bit. I know that you've worked on some of your own algorithms. You've talked about an early project where you were building, was it a movie database or a movie social network, similar to sort of Netflix, where making recommendations for the types of movies that someone might enjoy was the whole goal of the algorithm? What did you learn from writing that code yourself, seeing it work, getting into the weeds on it, and then, of course, spending a lot more time throughout your career studying it from there?

jen_golbeck:
Yeah, I mean, that was like my dissertation project. You went way back to find

sean_lukasik:
Ha ha.

jen_golbeck:
that one. Yeah, you know, that project that I did, and bits of it have made their way into all these algorithms, good and bad, that we interact with now, was just thinking about: can we use our social connections to inform those kinds of recommendations that we get? Is that helpful? And we know it is now; that shows up all over the place. In the early 2000s, when we were doing this kind of recommender system research, right, like Netflix suggesting stuff, it was kind of based on ratings. So if you and I rated a bunch of movies in the same way, it would maybe suggest movies that you like to me, because our tastes are similar. So my work was kind of looking at, what if we bring social relationships into that? I think a really interesting thing that I found from that is how easy it is to screw up the trust that you have in a recommender. I was having people rate movies on the AFI's top 100 movies list. And one of the movies on that list is A Clockwork Orange, which I have seen, and it's a couple hours of my life I wish I could have back. My life is worse for having watched that movie. Now, I understand, from a filmmaking perspective, why people like it. But there's nothing in me that needed to see that movie. And what I found, as I was really analyzing this data, is that anyone who gave that movie a good rating, I did not want to hear anything from them at all,

sean_lukasik:
Mm-hmm

jen_golbeck:
right? Because it just was like, that one thing is enough to entirely destroy my trust, because if you would recommend anything like that to me, that's going to be kind of profoundly harmful as much as that makes sense in a movie context, right?

sean_lukasik:
Sure.

jen_golbeck:
So it turned out that we had always been thinking about the right people to give you recommendations as the ones who are really similar to you. But for me, there's this one thing we disagree on, and that one thing is the deal breaker. And that actually turns out to be really important, because people can lose trust in these systems, right? If Netflix just accidentally drops in a few things that are really disturbing to you, you're not going to listen to it at all, even if it's right on 95 percent of things. So that was unexpected and a pretty interesting thing to find, and it started literally from myself looking at the data and finding myself in my head going, oh, I'm never talking to that person about movies again.
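
A minimal sketch of the two ideas in this story: rating-similarity recommendation, plus a deal-breaker rule where one high rating on a banned title zeroes out trust entirely. Titles, scales, and thresholds are illustrative assumptions, not the dissertation's actual method:

```python
# Ratings are dicts of movie title -> score on a 1-5 scale.
def similarity(mine, theirs):
    shared = set(mine) & set(theirs)
    if not shared:
        return 0.0
    diffs = [abs(mine[m] - theirs[m]) for m in shared]
    return 1.0 - sum(diffs) / (len(diffs) * 4.0)  # 4 = max gap on 1-5

def trusted(theirs, deal_breakers, threshold=4):
    # One high rating on a deal-breaker title destroys trust entirely.
    return not any(theirs.get(m, 0) >= threshold for m in deal_breakers)

def best_recommenders(my_ratings, others, deal_breakers):
    # Rank other users by taste similarity, but only if trusted.
    candidates = [u for u in others if trusted(u["ratings"], deal_breakers)]
    return sorted(candidates,
                  key=lambda u: similarity(my_ratings, u["ratings"]),
                  reverse=True)
```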

sean_lukasik:
What makes you hopeful about the technology that we have today and how far it's come, you know, during your career? It wouldn't be an interview with you without hearing dogs in the background, by the way.

jen_golbeck:
I'm sorry, there's a boat outside.

sean_lukasik:
That's fantastic. Yeah, so what makes you hopeful about the technology that we have today and what it's capable of? Even with everything we've talked about, it feels like it can do some pretty phenomenal things.

jen_golbeck:
Yeah, you know, it's important to talk about the dark stuff, and I feel like that is a lot of the conversations

sean_lukasik:
Yeah

jen_golbeck:
I have. But you know, I remember not having social media. I remember when blogs were babies, right? And, like, not

sean_lukasik:
Mm-hmm.

jen_golbeck:
many people had those, and it was really hard to put content online. The thing that we have gotten from social media is this ability to find community for whatever we care about, even these tiny little topics that, like, nobody around you cares about. And you can think of ways that can be so helpful. You think about someone who's in a marginalized group in a place where they don't have access to anybody who is experiencing what they're experiencing. So that could be someone with a rare disease, right? Say there's 200 people in the US who have been diagnosed with this life-threatening condition that you have. Well, you can go online, they're all going to be in a group on the internet, and you can talk

sean_lukasik:
Yeah.

jen_golbeck:
to them about what they're doing, where before, you probably would never talk to anybody who had that condition. Or, you know, say you're a gay Muslim teen in rural Texas. I mean, how many people are there like that where you are? Maybe one or two, if you're lucky. And that's a very unique and different experience based on all of those demographic characteristics. And you can go on social media and connect with those people and find them and get resources. And it's hard, if you were not experiencing the world before social media, to know how cut off you were from community and from information that's now critical to the way that we interact. And that's something that, you know, I'm either, I don't know, a xennial, an aging millennial,

sean_lukasik:
Ha

jen_golbeck:
right? Like I'm right

sean_lukasik:
ha.

jen_golbeck:
on that borderline. But, you know, I kind of grew up using computers and things, and I remember what it was like before we had access to that, and how hard it was to get things, and how much voices were silenced, right? You just had to accept what those major channels were giving you. And I think it's a profound shift in our society that now we don't have to rely on that. And in fact, we see big social movements like Black Lives Matter being fueled by the fact that in Ferguson, they had access to Twitter and were posting videos, where ten years before that, you would have gotten the major news networks' coverage, which would not have matched the experience on the ground. And that gets

sean_lukasik:
Hmm.

jen_golbeck:
overridden when everybody has the ability to share. So there are profoundly good things that come from it too. And I think that's why it's important that we don't try to throw it all out because there are scary things; we try to figure out how to solve those scary problems.

sean_lukasik:
Yeah, absolutely. And, you know, I do want to talk about some of the conspiracy theories that can really grow, and the dark side of that piece of things, but it's heartening to hear about the ways that people connect these days. I remember when it felt like COVID was really closing in on us, meaning, you know, there it was in parts of Asia, and then we were seeing these really scary decisions being made in countries like Italy and parts of Europe, where they were just shutting things down. And it felt like it was moving towards us, but it also felt like we were learning and seeing and hearing stories of hope and excitement from those people in other parts of the world. So by the time it got to us, which is almost exactly three years ago now, I felt a little bit like there had already been a playbook, so to speak. And so, yeah, I love hearing from you the optimism and the hope that you have from some of these things.

jen_golbeck:
I think it's important to see that. And I mean, I think that's a really interesting example, too, because frankly, in terms of culture and information, in the US, we tend to be the ones sending it out to

sean_lukasik:
Hmm.

jen_golbeck:
everybody else. And COVID is a really interesting example where we were sort of lagging on the outbreak here. And you're absolutely right, we could benefit from all that information that everybody else had. And it was obviously a really scary time here, too,

sean_lukasik:
Yeah.

jen_golbeck:
whereas if you were in China two months earlier, they didn't know anything,

sean_lukasik:
Yeah.

jen_golbeck:
right? And we benefit from that.

sean_lukasik:
Yeah, and we learned a lot, you know, hour by hour, day by day, it felt like at that time. So any

jen_golbeck:
Yeah

sean_lukasik:
bit of a head start that we had felt like an advantage.

jen_golbeck:
Absolutely.

sean_lukasik:
So to dive right back into it, I know that you wrote about Kelly Weill's book Off the Edge, where she covers flat earth conspiracy theorists and talks about how easily those conspiracies spread: you watch one video and all of a sudden your entire recommended library becomes conspiracy theories, and not just the one but so many others. And in your review of that book, you said, as a social media researcher who spent the past several years studying what goes on in the murky corners of the internet, my heart ached with recognition. So what is it that you recognized in that book about conspiracy theories and the way that they spread?

jen_golbeck:
Yeah, you know, I obviously spent a lot of time looking at QAnon and lurking in those groups and seeing the kinds of things that they were talking about. And you know, at heart, conspiracy theories and the people who believe in them share a lot of common things. They make you feel like you have special knowledge, which is a thing that we all like, right? I know a secret and nobody else knows it. I worked briefly in the intelligence community, and the first time you get that top secret clearance, you feel so powerful and awesome. You don't learn anything interesting most of the time,

sean_lukasik:
Yeah.

jen_golbeck:
but it's like, I know this thing and you're not allowed to know

sean_lukasik:
You don't

jen_golbeck:
it. Like

sean_lukasik:
go

jen_golbeck:
if,

sean_lukasik:
straight to the alien archives once you get that clearance.

jen_golbeck:
I tried, they don't let you in there.

sean_lukasik:
Okay.

jen_golbeck:
They don't let you just go into that stuff mostly. You're like, here's the number of tanks that this country has. And you're like,

sean_lukasik:
Right

jen_golbeck:
I think we kind of knew that. So usually it's pretty boring.

sean_lukasik:
Yeah

jen_golbeck:
But that unique knowledge, feeling like you've figured something out, is intoxicating and interesting, and conspiracy theories bring that to people. And I think now, in light of all this other stuff we've talked about with social media, we have access to experts in a way that we didn't before. And that can be powerful and amazing, and it can also be really intimidating. Because if you go back to the 80s or 90s, if I was like, you know, how did this rocket possibly work like that? I

sean_lukasik:
Mm-hmm

jen_golbeck:
would look it up in the encyclopedia or maybe go to a card catalog. And now I can ask, how does this rocket even work that way, on Twitter, and actual rocket scientists will reply to me with

sean_lukasik:
Yeah.

jen_golbeck:
information. So the smart person who kind of understood stuff, that role you could play before the internet, doesn't work anymore. You can't be that person who sort of understands physics or chemistry or biology and gives your best guess, because the actual experts are there and will tell you how it works. So that can be disempowering for people who feel like they're really smart,

sean_lukasik:
Hmm.

jen_golbeck:
and they can get really upset. And as a woman who knows a lot about certain kinds of technology and statistics and math that people misuse on the internet, when I correct them, I absolutely get the brunt of their anger at

sean_lukasik:
Yeah,

jen_golbeck:
being corrected

sean_lukasik:
yeah

jen_golbeck:
on what they're doing. And so conspiracy theories offer this space where you get to reject those experts who are telling you that you're wrong about a thing that you may really wanna believe in. And maybe, you know, they're not gonna tell you necessarily that you're dumb, but essentially there's this implication that you're dumb for

sean_lukasik:
Mm-hmm.

jen_golbeck:
doing

sean_lukasik:
Mm-hmm.

jen_golbeck:
that. And instead, you get to believe in this thing where you kind of get to create the facts around it. And we really saw this with QAnon, but you see it with the Flat Earthers too: someone will come up with an idea, and then everyone will be like, yeah, this idea, you're so smart for coming up with that idea, and we're all going to try out that idea. And these dark corners give you a place to find those people, and I think, really importantly, to find a community who's gonna embrace you, who's gonna celebrate your interest in a topic, who's gonna tell you that you're smart. And that can go really well until it goes bad. But I absolutely see how it attracts people. And the same things that she talked about in her book on Flat Earthers, you see that echoed in all the conspiracy theories, and QAnon is just one of the clearest ones.

sean_lukasik:
Now, YouTube spent some time a couple of years ago adjusting its algorithms so that they wouldn't necessarily just continue to serve conspiracy theories if someone liked or watched an entire documentary on flat earth conspiracies. How difficult is it to make those kinds of sweeping changes, or, you know, what does it look like when you do adjust those things for safer and more reasonable recommendations?

jen_golbeck:
Yeah, it's both very easy and very difficult, depending on which part of the problem you look at. If we know that a video is about a conspiracy theory, it's really easy to deprioritize it, not recommend it, whatever. So that part is technically super easy. How do you figure out what that thing is, is a much harder question. And I was actually at a meeting at the National Academy of Sciences with a group of researchers and experts and a group of people from Google, which owns YouTube, talking about exactly these problems. Google helped sponsor the event because they're like, we need to figure out how to not do this. And one of the biggest spaces where they had problems with this was in anti-vax and kind of health conspiracy stuff on YouTube, and not even stuff that's super mainstream at this point. So there was a whole set of videos that said eating frozen lemons will cure your cancer. You put lemons in the freezer, cut them up, eat them, cancer will go away. This is not true. This is absolutely

sean_lukasik:
Yeah,

jen_golbeck:
not true.

sean_lukasik:
yeah

jen_golbeck:
But there were a ton of these videos, and they're still out there. There's a ton of them. And someone who I think was a former FDA commissioner came in to YouTube and was like, you're killing people by showing them this information. Because they find this video, they see a whole bunch of it, and if one person opts not to get the actual treatment that they need, they're going to die. And it's because of your algorithm. And I think that really pushed them to try to figure out, all right, how do we solve this problem? And it doesn't just have to be these frozen lemons. How do we stop pushing this kind of content on people? So the hard part of solving that is figuring out, what is that content? Because sure,

sean_lukasik:
Mm-hmm.

jen_golbeck:
like frozen lemons are easy. Like, actually, they do not cure cancer. But

sean_lukasik:
Yeah.

jen_golbeck:
there's a whole lot of gray area, like in the healthcare space, right?

sean_lukasik:
Sure.

jen_golbeck:
Which I think we're all kind of getting familiar with as we've gone through COVID, right? There's

sean_lukasik:
Yeah.

jen_golbeck:
stuff where it's like, we thought it would work and then it doesn't work, and people don't know how to interpret the studies. But there is stuff to actually be worried about, and figuring out what actually counts in that space, like, that's really hard. That said, a thing that drives me crazy about this is that there is very straightforward stuff that we can deal with, right? There

sean_lukasik:
Yeah

jen_golbeck:
are known fake news sites, there are known misinformation sources, there are known accounts on all these platforms that are spreading this stuff. It's established. Let's block them. Right? Like, let's stop

sean_lukasik:
Mm-hmm.

jen_golbeck:
sharing the stuff that they create. And I think the reason that we haven't seen action that overt and simple on the easy part of this problem comes down to a policy question, which is that all these companies, I think, are very sensitive about, are they being editorial? And this gets into the whole Section 230 law that kind of protects social media companies from being responsible for what people do on their platforms. They're legally very aware of that and very careful about these choices, where, like, if I were in charge I'd be like, I don't care, ban

sean_lukasik:
Ha.

jen_golbeck:
it all, block all of it. But, you know, they legitimately do open themselves up then to this question of, are you making decisions about content, and then are you maybe responsible?
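
The "easy part" described here, sketched as a simple blocklist filter; the domains are made-up placeholders, since the hard, editorial work is deciding what belongs on the list:

```python
from urllib.parse import urlparse

# Made-up placeholder domains; curating the real list is the hard,
# human part of the problem.
KNOWN_MISINFO_DOMAINS = {"example-fakenews.test", "example-hoax.test"}

def filter_known_sources(posts):
    kept = []
    for post in posts:
        domain = urlparse(post.get("link", "")).netloc.lower()
        if domain in KNOWN_MISINFO_DOMAINS:
            continue  # drop content from known misinformation sources
        kept.append(post)
    return kept
```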

sean_lukasik:
Interesting. And so to kind of, well, let me first ask you, you spend a lot of time in the dark corners of the internet. Where do you go at the end of your day,

jen_golbeck:
Hehehe.

sean_lukasik:
before you close the computer or turn the phone off? Do you find like an eye bleach subreddit and look at cute animals for a while?

jen_golbeck:
So I actually started putting my dogs on the internet because of this problem, because I was spending so much dark time there. And kind of after the 2016 election, everybody on the internet was mad, regardless

sean_lukasik:
Yeah.

jen_golbeck:
of who you supported in the election, or, you know, Brexit, the same thing. Everybody was mad, and I was like, I need a corner that's happy, and there kind of weren't those corners. So I was like, I'm gonna make one, and I'm just gonna post pictures of my dogs; at the time I had four golden retrievers. So I end my day on our social media about the dogs. I can look at my dogs in person, but we've ended up building this really beautiful, supportive community that I've worked hard at, to block all the jerks so they can't get in there.

sean_lukasik:
Yeah.

jen_golbeck:
And it's just this lovely place of people being nice to each other and saying nice things. It's very wholesome. And that's where I end every day before I go to bed.

sean_lukasik:
Well, thank you for creating that. When I gave my TED Talk ten years ago, I was talking about the difference between a friend, in air quotes, that you have on the internet, someone that you just might follow or someone that you knew decades ago that you don't really keep in touch with anymore, and what I deemed a Paesano, which is someone who you have a close relationship with, the key difference being that a Paesano is someone who actually has a physical benefit for your body, for your health. By having real relationships, we know that we can positively impact our mental and emotional health, and even our physical health, through good solid relationships. Now, I gave that talk ten years ago, so I think that's changed a little bit since then; you used the example of someone with a very rare disease, where only 200 other people have the same disease, and now they can reach out and talk with those people. And you've talked about how dogs make us physically healthier. You co-wrote a book on this. Can you talk a little bit about how dogs are the Paesanos we really need in our lives?

jen_golbeck:
It's so interesting, and it's really in line with what you talked about in your TED Talk. So I have been a dog lover my whole life, and we rescue special-needs golden retrievers now, so it is a challenging but joyful part of my life. And I've done a couple of studies on dogs and the internet, but I had become fascinated by a study of people who had had a major cardiac event, so a heart attack, a stroke, and whether they owned dogs. Out of this sample, and I'm not going to get the numbers right, but it was like, you know, 100 people who had these cardiac events who had dogs, and then 100 people who didn't have dogs. Out of the people who had dogs, like one person died in the year or two that they followed up. And among people who didn't have dogs, it was like 15 people died.

sean_lukasik:
Mm.

jen_golbeck:
It was this massive difference. And you might say, well, you have a dog, you walk the dog, you're exercising more, and that's why you live longer. They controlled for that, and it still is a big benefit if you have a dog. This was fascinating to me.

sean_lukasik:
Mm-hmm.

jen_golbeck:
I mean, it feels good, right? I love dogs.

sean_lukasik:
Yeah,

jen_golbeck:
It makes me

sean_lukasik:
yeah

jen_golbeck:
feel like I'm doing the right thing. But why was this the case? I didn't really know the answer, and set out to write this book about it. It's called The Purest Bond, and it's about all of the ways that our lives are made better by dogs. And the thing that emerged out of just reading all of the research on mental health, physical health, emotional health, our social connections, is that our dogs make all of those things better. One of the big themes that came out of it, that actually explains a lot of these physical health benefits, is that we know, psychologically, that if you have a strong emotional support system, a close family, good friends, you're going to live longer. That translates to your physical health.

sean_lukasik:
Mm-hmm

jen_golbeck:
And dogs will actually fill in for that. Dogs can become that social support for you. They listen to us. They are empathetic. Like our dogs can tell when we're upset and they come to us. In psychology, there's this idea called attachment theory or attachment bonds, which is sort of the bonds that babies make with their mothers, like very

sean_lukasik:
Hmm

jen_golbeck:
early on. And we of course can say, well, yeah, of course, I love my dog, I'm attached to him. But there was a study where they put dogs in an fMRI and played sounds of their people talking versus kind of strangers. And the same part of the dogs' brains lit up in the fMRI as in babies' brains when they had access to their mother. So

sean_lukasik:
Hmm

jen_golbeck:
it sort of suggests that dogs are actually forming that same kind of attachment bond with us. So they really, legitimately, are these kind of family-like social support figures. And in all of these physical health studies, you see that the dogs help, and they help most for people who don't have strong social support systems, which really emphasizes that dogs will fill in even if people aren't there for you. If you have lots of close family and friends, I mean, a dog is great, but they're not giving you a thing that's missing. But if that's missing, then you see this really huge impact, which is pretty fascinating.

sean_lukasik:
What do you think we can learn from them? They're never on their cell phones.

jen_golbeck:
Hehehe.

sean_lukasik:
They're never distracted when we're trying to talk with them, or, you know, they're hyper-focused when we say the thing that they might want to do. What do you think we can learn from their lifestyle and their attitude towards life? We're living in such an internet-ruled culture, such an internet-ruled world. What are some of the lessons that you've learned from, well, Golden Retrievers,

jen_golbeck:
Heh.

sean_lukasik:
I understand, but from the dogs that we love in our lives.

jen_golbeck:
It's funny that you say that. I was literally walking my dog last night and thinking, he's never on his phone,

sean_lukasik:
Ha

jen_golbeck:
So I

sean_lukasik:
ha.

jen_golbeck:
should really let him like sniff this bush because that's about as close as he's

sean_lukasik:
Yeah.

jen_golbeck:
gonna get to that, and, like, there's something really nice about that. It's interesting. So dogs are emotionally sophisticated; they have a lot of emotions. But their brains don't seem to get to the point where they can feel shame or guilt. They can feel fear. And I think that's a thing I've been thinking about just in the last couple weeks. Like, if you're a human who doesn't feel guilt, you're kind of a psychopath,

sean_lukasik:
Yeah,

jen_golbeck:
right? Like, you do bad things and don't feel bad about it.

sean_lukasik:
the definition of one

jen_golbeck:
Yeah, but it's actually like really nice in dogs, which

sean_lukasik:
Yeah

jen_golbeck:
maybe speaks to their better natures. But they're very mindful creatures, and I certainly get this forced mindfulness from them. They kind of force me to live in the moment with them too. I can't be like, everybody just hang on. They're like, no, no, no, this is the time for whatever the thing is; now is the time. They don't know what later is. And

sean_lukasik:
Yeah

jen_golbeck:
that can be really good. Like, you know, occasionally they have to wait, but a lot of the time I can stop what I'm doing, and I think that's really great. For me, one of the biggest lessons I've learned, though, rescuing dogs, is that almost all these dogs that we take in come from pretty rough backgrounds: definitely neglect, oftentimes explicit abuse. You know, we have one now who was kind of locked away in the cone of shame for five years. We had another

sean_lukasik:
Mm.

jen_golbeck:
who spent his first six years on a chain, and when he got sick, his people said to just put him down because they didn't want to deal with him anymore. You know, they have no reason to ever like or trust a person again. And they come in very afraid. And it takes like two months, and then they come around. You know, some of them get to their final state sooner than others. But even this dog who was on the chain, right, he's a diabetic. He'd never been to the vet. He went blind because of his diabetes. And when his previous people saw he was running into stuff, they brought him to the vet for the first time. The vet said, he has diabetes, you've got to start giving him shots. And they said, we'll just put him down, we don't want to deal with that. So a vet tech took

sean_lukasik:
No.

jen_golbeck:
him and gave him to us. But he had, you know, every tick-borne disease. His fur was falling out. He had a perfectly square bald patch between his shoulder blades where the chain had rubbed the fur away. He had never had any human affection. And he was a nightmare. I mean, I cried for like six months trying to manage this dog.

sean_lukasik:
Yeah.

jen_golbeck:
He was not housebroken. He didn't know how to get along with the other dogs. He was running into stuff because he was blind. And you know, it's been about 18 months. His vision's coming back. His diabetes is controlled. He's all fluffy. He loves the other dogs. He's super snuggly and great. To forgive, or at least move past, what happened before? Man, I aspire to have that. And it's a thing with all these dogs that we rescue that I see a lot and am reminded of all the time. I know what they were like a few months ago when they came in, and how utterly transformed and happy they are, having completely let go of all of that. That's pretty amazing.

sean_lukasik:
Well, thanks for sharing that. And thank you for taking the time to have this conversation. I really admire the work that you do. I admire that you spend so much time in those dark corners learning on behalf of all of us, and that then you choose to put the dogs first and foremost in front of everybody, and use that knowledge to keep that corner safe for the rest of us. But truly, I recommend your work all the time. Is there any one talk or place where people can find out more about the work that you do? Or do you just recommend that they follow the dog account,

jen_golbeck:
Thank

sean_lukasik:
the golden

jen_golbeck:
you.

sean_lukasik:
ratio, maybe that's it. But is there anywhere that you would suggest that people go to find out more about what you do?

jen_golbeck:
So definitely follow dogs because that's gonna make your life best, but

sean_lukasik:
Mm-hmm.

jen_golbeck:
I'm pretty active on Twitter, sharing stuff that kind of covers the gamut of what we talked about today. I tweet sometimes, but I think I'm a good retweeter of important things. So I'm Jen Golbeck on Twitter, and that's a great place to start.

sean_lukasik:
Great. Well, thanks so much for joining me on the Paesanos Podcast and enjoy the weather down in Florida.

jen_golbeck:
It was a pleasure. Thank you.
