My thoughts on narcissism and social media

Popular media has been giving significant coverage to a recently published article on the relationship between narcissism–a personality trait characterized by excessive admiration of or love for the self–and use of social network sites (SNSs). This line of research is not new (see here and here[pdf], for example), but it brings more attention to an ongoing discussion regarding Gen Y, technology, and the “downfall” of modern society. More broadly, it speaks to long-standing questions about the potential negative impacts of new technology on individuals, groups, and relationships.

While none of the studies I’ve referenced above establish causality, media stories I have read suggest that sites such as Facebook are, in fact, making today’s youth more narcissistic. Take, for example, the headline “Facebook Feeds Narcissism, Survey Finds” from a CNN article about the recent study. In response, I (first) get an exasperated look on my face and (second) direct the authors to the following xkcd strip:

As to my exasperation, first let me comment on a pet peeve of mine. Having been in media in one form or another for more than a decade, I understand that catchy titles draw in readers; however, bad reporting of data and failure to acknowledge a study’s limitations often lead to gross exaggeration of research findings. Academic studies are typically written in a way that makes them inaccessible (both literally and cognitively) to the average person. This is a major reason we have journalists: to translate technical findings into a format that nearly anyone can understand. Due to either a lack of knowledge on the part of the writer, the desire to write a more “interesting” story, or sheer laziness, this process often ends like a game of “telephone,” with the final story only somewhat resembling the original. And while I don’t expect your average newspaper or broadcast news writer to understand the technicalities of a newly published stem-cell study (heck, I would barely understand that), distinguishing between correlation and causation is essential to accurate reporting. Furthermore, when reporting significant findings (such as those that could have an impact on policy down the line), it is essential that writers acknowledge potential limitations, such as issues of generalizability, small sample sizes, or small effect sizes.
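For the non-methods folks: here is a toy simulation (my own illustration, not drawn from any of the studies above) of why correlation doesn’t imply causation. Two behaviors can correlate strongly simply because a hidden third trait drives both of them–neither one causes the other. The variable names are purely hypothetical labels:

```python
import random

def correlated_without_causation(n=10_000, seed=42):
    """Toy demo: a hidden trait drives two behaviors, so the behaviors
    correlate even though neither causes the other."""
    rng = random.Random(seed)
    hidden = [rng.gauss(0, 1) for _ in range(n)]   # e.g., some personality trait
    x = [h + rng.gauss(0, 1) for h in hidden]      # e.g., time spent on an SNS
    y = [h + rng.gauss(0, 1) for h in hidden]      # e.g., self-promoting posts
    # Pearson correlation of x and y, computed by hand
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy)
```

Run it and x and y correlate at roughly r = .5, a “significant relationship” a headline writer could easily spin as “x causes y”–even though, by construction, it doesn’t.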

Okay, back to the findings. The most recent study found positive correlations between various components of a user’s profile and narcissism. Unfortunately, the coding of profile components appears to be seriously flawed for a number of reasons. First, coding was performed only by the author, which prevents inter-coder reliability from being established and potentially introduces researcher bias: because the author is also the coder, she could unconsciously have coded content to reflect the findings she wanted. Another concern is that the author provides no details regarding whether she created a code book or, assuming she did, how she established criteria for coding. For example, she states that use of positive adjectives such as “nice” and “funny” in the About Me section of the profile was coded as an indicator of narcissism, which I find to be a bit of a stretch. Likewise, she coded the use of photo editing software on the profile picture as an indicator of narcissism, which raises two concerns for me: (1) photo editing software is often not easy to detect, and (2) in cases when it is, it may be serving an artistic purpose rather than an egotistical one. Because the author had no interaction with participants regarding why they made their content choices, it is impossible to make assertions as to the reasons behind those choices.
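For readers unfamiliar with inter-coder reliability: the standard fix is to have a second coder independently code the same content and then report an agreement statistic such as Cohen’s kappa, which corrects raw agreement for chance. A minimal sketch of the computation (my own illustration; the study in question reported no such statistic):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders assigning nominal codes to the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items both coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

With two coders rating profiles as, say, “N” (narcissistic) or “P” (plain), kappa near 1 indicates the code book is being applied consistently; values much lower suggest the coding criteria are too vague–exactly the worry with “nice” and “funny” as narcissism indicators.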

A second point related to this and the other studies I have seen on narcissism and SNSs relates to the technology itself. SNSs are centered on sharing information; they are designed to simplify the process through which users post content to an audience. Facebook prompts users to post status updates with the query “What’s on your mind?,” which is a direct request for information about the individual. Therefore, we need to be especially careful in creating operational definitions of what constitutes narcissistic content, to make sure we are measuring what we set out to measure and not merely capturing standard practices on the site. Furthermore, research should work to establish behavioral norms on the site — what a researcher perceives as “narcissistic,” such as posting a specific type of photo, may in fact be the norm for a given group.

A third point to consider when interpreting the results of this and other studies on narcissism and SNSs is the choice of population. Early research on SNSs (and, to be honest, the vast majority of current research as well) tends to employ college student populations, most likely because college students were the primary users of these sites and because they are a convenient sample for academic research. College students are, in many ways, in a four-year transition from youth to adulthood, a period that J.J. Arnett refers to as “emerging adulthood.” It is unsurprising that people at this age are self-absorbed — not only are they trying to figure themselves out, but they’re apt to try a variety of ways to fit in, which probably requires a bit of self-promotion.

So is “Generation Me” a more apt name for Gen Y (as at least one book has suggested)? Or have advancements in technology merely made young people’s narcissistic tendencies more public than previously possible? While only further research can answer this question, one study provides some initial insights: a recent meta-analysis of studies conducted between 1976 and 2006 found no relationship between cohort and egotism, individualism, self-enhancement, or self-esteem. In other words, kids today are just as caught up in themselves as their parents were at the same age; they simply couldn’t share their thoughts on how awesome they were with the rest of the world as easily.

Looking forward, an important next step is to conduct more research with non-college-student populations and to identify ways of studying new technology adoption and use over time, so as to address questions about the types of changes in beliefs and behaviors that social media may be effecting. I’ll be presenting research in November at the National Communication Association’s annual conference on how adult users (ages 25-55) negotiate the tension between the social capital benefits of Facebook use and concerns about making those disclosures. I also hope to employ multiple methodologies (both qualitative and quantitative) in future studies so as to achieve both breadth and depth of findings. More on that as it develops, but I should probably stop talking so much about myself, lest you think I’m narcissistic. 🙂

Methodological differences between academic and media-based research

The other night while flipping channels on TV, I came across the show What Would You Do? on ABC. For those of you who have never seen it, WWYD is a hidden-camera show that places people in a wide variety of scenarios to see how they react. The premise is that people often say they would hypothetically perform a given action–whether helping an injured stranger on the street or calling out a person stealing from a store–but rarely go through with such actions when actually in that situation. Take, for example, the story of Kitty Genovese, which led to what is now known as the “bystander effect,” whereby people tend not to offer help during an emergency because they assume someone else will. The show deals with a wide variety of ethical issues–none quite as serious as witnessing a murder–ranging from a young, intoxicated woman being lured out of a bar by a stranger to a waiter discriminating against a gay couple.

The episode I saw included a segment on the dilemma of whether to tell a friend when you witness his/her significant other cheating. The WWYD crew recruited a couple and created a situation where a friend of the couple saw the man at a restaurant cozying up with another woman. Then they had the woman supposedly being cheated on come to the restaurant for lunch with the friend who just witnessed the transgressions to see what the friend would do. The embedded clip is from 20/20 but has highlights from the episode I watched.

As yet another sign that I have spent too much time in the ivory tower, my immediate reaction upon seeing this clip was, “no IRB (institutional review board) would have approved that study!” For me, it harkened back to Milgram’s experiment on the extent to which individuals obey orders from authority figures. Obviously these experiments (and I use “experiment” quite loosely when talking about WWYD) were dealing with entirely different questions; however, Milgram’s experiment is one of the foundations for dealing with questions of ethics in research, and especially about justifying placing participants under extreme emotional stress.

Merely watching the woman who was dealing with this dilemma made me uncomfortable, for she was being placed in quite a difficult position and was clearly under emotional duress. Try placing yourself in her position for a moment and consider how hard it would be had you just learned that one of your closest friends’ significant others was cheating on her. Then consider how you would feel upon learning that the “joke” was on you: not only did all your friends know the situation was a sham, but now the whole nation can watch you freak out. (Not only that, but in the video, the woman offers her friend a Xanax as she tells her the bad news; had her friend not had her own prescription for Xanax, she could potentially have been in trouble with the law for violating federal regulations regarding transfer of controlled substances. Yes, I’m sure that wouldn’t have made the final cut, but still, not cool.)

At the same time, there’s a flip side to my academically derived disgust at the TV show’s sensationalized take on conducting research: not all questions can be answered through IRB-approved studies. Sometimes it’s faster, easier, and cheaper to conduct research through a media or corporate organization. There are plenty of non-academic research organizations whose work I have the utmost respect for, even though they don’t go through a rigorous review-board-driven process before conducting their studies (and I worked for one for three years). Even without an IRB, they most likely follow an internal set of ethical guidelines when conducting their research.

What concerns me is when your average person can’t distinguish between research that is rigorous and methodical and research that is done, for lack of a better word, half-assed. Maybe WWYD takes tons of precautions, provides thorough debriefings, and makes every effort to ensure their unknowing participants don’t experience any post-experiment trauma. I don’t question their desire to break ground on previously unstudied territory, but I do know that if I had been the woman in the video, I would have been pissed at a lot of people–and especially the friends who set me up–for a very long time. Were their findings worth the emotional stress this woman went through? I’m not sure.

This leaves me with a lot of questions. When is this kind of research justified? Are there questions that only academia (or only non-academia) should tackle? Are there ways to conduct experiments such as this that could answer the same questions while being acceptable to an IRB (even if they’re not being submitted to one)? If so, is it unethical to conduct the original experiment?

I have a feeling I’m going to be thinking about this for a long time.

Why I quit Farmville… and why I think you shouldn’t

I need to make a (somewhat embarrassing) disclosure. Yes, I once played Farmville. And Cafe World. I started playing out of both curiosity and the sense that I needed to try it out because of its relationship to my research; I stayed because it hooked me the same way it hooks millions of other players. Yes, I wasted hours planting and harvesting virtual crops and “cooking” virtual dishes. I never took to the creative aspect of the games by trying to create the most aesthetically pleasing space (mainly because I’m not very artistically inclined), and I never posted about it to my News Feed (because that is quite annoying). Instead, it was the competition that kept me–I certainly took a small pleasure from “beating” fellow network members at leveling up, at least in the beginning. After a while, even that became a chore, as it takes days, if not weeks, to gain a single level.

Then one day earlier this year I quit on impulse; I removed all game-related applications from my Facebook. And I felt a relief when I did so, like I had taken out the trash, trash that was starting to stink.

Okay, maybe I’m exaggerating a little on that last statement. But I did feel some relief when I quit playing the games, like I no longer had to hide a dirty little secret–that I played the most annoying game on the Internet–from my friends. Why was I so loath to admit to playing games that 240 million other people play? Why is Zynga (the company behind these games) the source of so much wrath, its games stigmatized by many? And to take a step back even further, what benefits (if any) could be accrued through playing these games?

While Zynga founder Mark Pincus is nothing if not a shrewd businessman (his company is set to clear $500 million in revenue this year and reached 100 million users years faster than Facebook), his games also excel at inciting a deep-seated hatred among non-players. If you Google “I hate Farmville,” 124,000 results pop up. A popular YouTube video spoofing a Farmville commercial opens with, “Are you tired of playing games that are fun?” And while I agree that Farmville and its cousins are not the action-packed games you often see on PS3 and Xbox these days, they’re certainly nothing new. People have been playing simple games like this for years: whether it’s Solitaire or the Adventures of Lolo, there is a slightly competitive element, but in the end, it’s just another way to pass the time.

I think people hate these games so much because of sharing abuse–as one interview participant noted in research I conducted at the beginning of the year, she hated that her friend kept “polluting her page” with constant updates about Mafia Wars. It should be noted that in March, Facebook changed some of its policies to halt the influx of notifications from third-party applications like Farmville–and that these changes are credited with a multi-million-user decrease in players of these games between April and May 2010. But the onslaught of updates throughout late 2009 and early 2010, coupled with people not knowing how easy it was to hide all those updates from their News Feed, was certainly enough to try even the most easy-going person’s patience. And, unfortunately for Zynga, the games became synonymous with words ranging from “irritating” to “devil-spawn.”

My research team–led by the awesome Yvette Wohn–recently had a paper accepted at HICSS 2011 that looks at the potential positive outcomes associated with social network game (SNG) play. Employing qualitative methods (in-depth interviews with 18 adult Facebook users ages 25-55), we saw several recurring game-related themes emerge across interviews. While some non-players may see SNGs as anti-social, players in our study repeatedly referred to social aspects of gameplay. We found three ways in which these games benefited relationships between various sets of individuals on the site, which we classified as initiating, maintaining, and enhancing. Below is a (very brief) summary of findings related to these three types of behaviors.

  • Some individuals forged new relationships through SNGs. At first, these “friendships” were created so as to advance within the game, but oftentimes game-related interactions led to real relationship development and interaction outside of the game.
  • Many individuals play these games with existing offline friends as a relational maintenance strategy, similar to the short communications that may occur through other communication channels on Facebook, such as wall posts or status updates. Especially for friends or relatives who were geographically dispersed, in-game interactions such as gift-giving acted as an alternative way to say “hi” or “I’m thinking of you.”
  • Finally, some individuals noted that gameplay allowed them to strengthen relationships with previously distant (or non-existent) ties. Establishing common ground through the game led to interactions that would most likely not have occurred without the impetus of shared gameplay.

So maybe Farmville isn’t evil after all. I know people love to hate on it, but these games represent yet another option in our relationship maintenance toolbox. Just because the tool is there doesn’t mean you have to use it, but it’s pretty apparent that beyond simply providing many people with entertainment, these games may actually serve a positive function for users looking to form new relationships and maintain–and even enhance–existing ones.

That still doesn’t mean I’m logging back on, but I guess I won’t diss it so much anymore.

I’ll post a PDF of the full paper in a few months.

Update on stuff and junk

It is time to revive this blog. Yes, I realize my last post, dated more than 15 months ago, stated that I was going to blog more. However, I am now entering the third year of my PhD program, which means a few things: (1) my course load has significantly decreased, and (2) I have a *lot* of writing in my future. As I prepared to return to Michigan for the new semester, I made a list of things I wanted to do when I got back to Lansing to help me be as productive as possible, and one thing that certainly helped me think creatively and critically while writing my master’s thesis was this blog. Keeping a blog related to my research interests helps me stay abreast of current issues in my field and go beyond a basic intake of information to considering how a specific piece of information fits into the bigger picture.

Because of this, I have made it a personal goal to blog at least twice each week during the next academic year on topics related to my area of specialization. And what, you ask, is my specialization? I study how online communication technologies impact relationships between individuals and groups, and how these technologies enhance, supplement, or detract from offline relationships. A lot of my research looks at the role of online social networking–and specifically the role of Facebook–in relational maintenance (see my CV for specific pieces I’ve written). However, my interests extend far beyond social network sites to online games, instant messaging, video chat, email, etc. If it involves computer-mediated communication, I’m probably interested in it.

This upcoming semester, I’ll be working on a number of projects that relate to these interests, and I’ll blog about my progress on them as I go along. And, on a positive note, I recently had three papers accepted to HICSS 2011 (Hawaii International Conference on System Sciences), which means a trip to Hawaii this January (woot!). The papers cover a wide range of topics: social network games, the relationship between bonding social capital outcomes and Facebook use, and the relationship between the avatar creation process and expectation of future interaction. Once those manuscripts are finalized, I will post .pdfs to my CV.

New blog name and plan of attack

My first year as a PhD student is quickly wrapping up…a little too quickly considering the workload I have left. But I thought I’d write a brief post to let people know (1) I’m still alive after somehow surviving my first Michigan winter, and (2) I plan to revive this blog to early 2008 levels over the summer.

First, you may notice the blog rename. I had grown tired of the “welcome to oblivion” name, as it had no relevance to my writings and was merely an obscure reference to a sci-fi book I was reading when I first started this blog. Thanks to @whatknows for suggesting I go with something simpler and more “academic-y.” Next time I see you, I will be bearing cookies. The reason for the new name is rather obvious: (1) I study social phenomena, albeit only in relation to technology; and (2) I am obsessed with my name. 🙂 As many media outlets are now so kindly pointing out, I must be a narcissist since I have Facebook and Twitter accounts.

So what will I be writing about in the upcoming months? Most likely my primary focus will be on social gaming, specifically MMORPGs. At some point I want to put up some background research I did last summer for the Pew report on teens and gaming that did not make it into the final version. I’m also working on two projects currently that focus on identity and interaction in World of Warcraft. Otherwise, I hope to stay on top of current tech news related to social network sites, with maybe a post or two related to social capital (another summer project) thrown in here and there.

But since I’m already procrastinating by writing this post, I should probably hold off on writing any more until the second week of May. That is assuming, of course, I survive the next three weeks. Sigh.

Too bad “pluggies” never caught on … maybe the yuppies ate them

So I just came across a chapter in the book Culture in an Age of Money: The Legacy of the 1980s in America, which refers to pluggies, “those who are plugged in but tuned out” (p. 84). Basically, the term refers to the rise of “me” technology in the 80s–VCRs, video games, and computers–that allowed people to lock themselves in their homes and be completely asocial.

I think the term makes complete sense, and as it does not encompass an entire generation, it may be more fitting than the typical response of “that’s all of Gen X” or Gen Y or Gen Now, or any of the other innumerable names for those of us in the 18-35 age range. However, when I did a Google search for the term, it only turned up ~2,500 results, which means “pluggies” never caught on. Maybe the idea of yuppies, which originated around the same time, so overwhelmed people’s ability to divide society into groups that they couldn’t afford another label. Or maybe society just agreed the word is kind of silly.

So has anyone else ever heard of pluggies or come across the term in reading?

Is social media making me meaner?

I was chatting with a colleague earlier today and he asked for my input on something he had recently heard. Basically, he suggested that social media is making the population snarkier. The reasoning goes a little like this:

(1) If we assume that the reason most people post status updates, comments, Tweets, etc. is to get attention, and

(2) If we assume that snarkiness is more likely to get attention than otherwise banal posts, then

(3) Logically, people should be increasing the snarkiness of their postings.

Since I am rather obsessed with observing these media outlets, my friend asked if I had noticed this. And I had to really think about it. The logic does have a degree of face validity. It makes me think of Generation Me, a book I bought a year or two ago that I still haven’t read (I’ve been busy!). The book looks at people born after about 1970: a generation who are more self-absorbed and have less respect for others than their forebears. For the me generation, it often is about “me, me, me,” and social media support the projection–and sometimes the shouting–of that identity to the world.

Look at Twitter. I will admit I am an avid user, and I use it for a variety of purposes, from keeping in touch with friends to posting news links to venting frustration (in 140 characters or less!). But if we break Twitter down to its most basic question–What are you doing?–it perpetuates the idea of me! Me! ME! The same can be said of Facebook status updates, which can be updated innumerable times a day if one so chooses.

But moving back to the question at hand, I have a hard time believing that social media are reshaping users’ identities in such a way as to make them snarkier, meaner, or inclined to post solely to get attention. Obviously, these sites let users play with identity in a way that is more difficult–or even impossible–in an offline interaction. But why be mean to friends on these sites when they know where you live? With Facebook at least, a key difference from more anonymous sites is that the vast majority of Facebook “friends” represent pre-existing offline relationships (see Ellison, Steinfield, & Lampe, 2007 for empirical support).

We can also look to other forms of media as introducing snarkiness into our daily lives. The two examples that pop to mind immediately as homes of snarky content are someecards and lolcats. So then, the question becomes: are sites like these a response to increasing snarkiness or are they making snarkiness more acceptable? Or both?

For me, the most basic question I come to is, Is snarkiness even a problem? I am about as snarky as a person can be, but I generally constrain my snarkiness in such a way as to make it clear that it is a part of my sense of humor and not a comment to be taken seriously. I also find myself evaluating my relationship to the individual before commenting on a photo or status update or responding to a tweet, and the snarkiness only comes out when I know the person will appreciate (or at least understand) the joke. But do I do it to draw attention to myself? Without probing too deeply into my subconscious, I would say not really.

So while I think this rationale for posting is feasible, at this time I don’t think it is necessarily the case. As ubiquitous as they are, SNSs still have something of that new-car smell for many users, who still get excited when they find an old friend or when someone posts a picture from back in the day. People are genuinely interested in the conversation and interaction, much more so than in getting their 596 friends to notice them. While I hate the saying “you can catch more flies with honey than vinegar,” it is true for many people. Then there are people like myself and several of my friends, who gauge the closeness of our relationships by how deeply we can insult each other (it’s harmless fun, I swear!).

Regardless, I think I’ll be taking a closer look at my Live Feed over the next week to see if any patterns of postings jump out to support this idea.

Blog Rename

While it originated as an obscure reference to a sci-fi/fantasy book I was reading at the time, my blog name tends to be a bit of a downer (unless you think my blog is all about the Elder Scrolls game, in which case you would be thoroughly disappointed), so I’m thinking of renaming it to reflect its more academic nature. While some may argue that grad school, especially at the doctoral level, could be conceived of as oblivion (right now I’m thinking more along the lines of the third circle of hell, since I’m obviously a glutton for punishment), the subject matter I typically write about is not exactly apparent from the blog’s name.

Because I am braindead most of the time thanks to the stacks of reading I have, I’d like to get reader input. So please, send me your suggestions for a new name for this blog and why you think I should change the name to your choice. Or, if you think oblivion is the perfect name, tell me why I should keep it as is. And if I decide to go with one of these suggestions, maybe I’ll even send the person some delicious Vitak baked goods! (If you are unaware of my baking skills, (1) shame on you and (2) read this post and/or this post.)

So come on people, tell me your ideas!

Is our attention spread too thin?

A Wired post today brought my attention to a new book coming out this fall, Distracted: The Erosion of Attention and the Coming Dark Age, by journalist Maggie Jackson. The book focuses on the Web 2.0 world, where the pressure of an ever-growing number of technologies forces us to be in all places at all times, both real and virtual. Phone calls, email, meetings, IMs, Facebook messages, Twitter updates…the list goes on and on. And in our attempt to “keep up” with the ever-changing methods of communication and interaction, we are instead losing a good part of the content, and maybe ourselves. Or at least that’s what I get from the sensationalist, doomsday-is-upon-us title.

Linda Stone, a well-known technology consultant I had the pleasure of meeting during my tenure at Pew, would call this “continuous partial attention.” In the technologically driven world of 2008, we have no choice but to divide our attention between tasks. It is not only a skill to have, but expected–and sometimes demanded–of us in our daily lives. Whether it is juggling a full-time job and a family life or struggling through a PhD program in hopes of becoming the best professor and researcher out there, we can no longer afford the luxury of focusing 100% on a given task. There are too many demands on us to even contemplate such a life.

Is this a bad thing? Maybe. On the other hand, maybe it’s just not the right way to look at the situation. From the little I’ve read regarding Jackson’s upcoming book, she is right on in many ways. Sometimes I feel stretched so thin that any further addition will surely break me into a thousand pieces. But we are humans, and we can change our future. We do not need to be slaves to technology. Instead, maybe we should consider new ways of harnessing technology to give us the ability to not only step away from it for a few minutes, but maybe even turn it off for awhile.

Turn it off, you say incredulously? Yes, I am just as scared as you. But as much as we let technology be a part of our lives, we should never let it take over our lives.

And that, my friends, is your daily food for thought.

Strangled creativity?

I haven’t blogged much since starting my PhD studies. In large part, this has been due to my busy schedule and, in general, adjusting to a new and very different lifestyle. But I don’t think that’s the only reason. I’ve been feeling less creative lately. Maybe my head is overcrowded with everything else I’m trying to retain, or maybe I’m just too tired all the time to think creatively, but either way, it’s not a good thing. I think it’s about time for me to sit down and re-evaluate some things. After all, thinking creatively should in turn benefit my critical thinking and writing, right? Grr…

Regardless, I should probably try to blog more, if for no other reason than to try and respark my creativity.