Interpersonal Political Communication Online


INTRODUCTION

I am unapologetic about my political beliefs. I also make those beliefs very clear on social media, and I do not back down from an argument. Because of this, I sometimes receive messages like the one below:
“SURE I'll SEE U AT THE MARCH FOR LIFE MARCH TOMORROW. A LOVING CARING EXPRESSION OF OUR SISTERHOOD NO NASTY HATE FILLED SCREAMING PROFANITY WITH GROSS FILTHY DISPLAYS OF CHILDISH SEXISM AND MISANDRIST HATE C U TOMORROW” - R

Two days after I attended the National Women’s March in DC, I received the above message from a member of my church. It was the first thing this man had said to me in well over a decade. I responded:

“Let me ask you something? Do you support using taxpayer money to fund programs for child healthcare like CHIP? How about government programs helping low-income parents feed their children like SNAP? Or that all children should have access to equal educational opportunities regardless of their parent's economic standing?

If you can’t answer those questions, R, then you're not pro-life. You're pro-birth.”

Initially, his response did not engage with my question, so I asked him again. That’s when things got interesting.

“Govt programs!! Have nothing to do with what I do about women's rights
Daily i live with women without govt programs and have done so longer than u have been on this earth. Govt programs don't provide a decent
life for all Gods children

Govt protects our rights to be Gods children or any children we wish to b. When u r married and r told u r wife is pregnant and u HEAR the babies heart beat
And u see the baby in her womb and deep in u r gut u feel u r child's soul there is NOTHING THAT ANYONE CAN DO TO HARM U R BABY OR KILL IT. IF YOUR WIFE TELLS U " SHE JUST ISNT READY TO HAVE THIS BABY, u r BABY u will fight with all u r MIGHT to save u r babies life and not let u r wife kill it.

IF U R BABY COULD POSSIBLY BE RETARDED AND THE DR looks at both of u and asks " WHAT DO U WANT TO DO" those r the living moments u LIVE U R BELIEFS, a govt program doesn't do it for u, a march calling marchers NASTY AND USING ALL SORTS OF GROSS PROFANITY MEANS NOTHING. ITS NOT SUPPORTING A WOMANS RIGHT TO KILL THAT BABY

ITS ALL ABOUT U
AND WHAT U DO TO KEEP U R BABY ALIVE. Until u r there, babbling about govt programs and marches with incredibly stupid gross pigs will b u r reality , after u have faced the possible murder of YOUR BABY THEN WE CAN GO TO A MARCH SND LAUGH ABOUT HOW OUT OF TOUCH THESE CLOWNS REALLY R AND BRUTALLY DISCUSS OUR REALITY
PERHAPS WITH U R BABY IN U R ARMS
U SEEM TO HAVE A MIND , USE it don't be led like the rest of the sheep asking for more govt programs and being victims.
GET away from them think for yourself be the free existential agent u were born to be and embrace life not the taking of it.”

The discussion between us is too long to include in this paper, but the full transcript can be read here.

While this debate is by no means my first, it was my most entertaining. More than that, this discussion got me wondering about communication theories surrounding interpersonal conflict and the rules that govern those interactions. Specifically, can these theories be applied to digital interpersonal communication and online political discourse?

It’s not difficult to find debates on social media about politics, especially with a new, controversial administration. Facebook, Twitter, Reddit, and the comments sections of news articles all host political conversations. Some are reasonable and thoughtful; others, not so much. The sections below will discuss a few interpersonal communication theories and concepts and demonstrate how many of the same principles are applicable both on and offline.

SOCIAL CAPITAL

Any discussion about the Internet’s role in politics must address social networking. While understanding social media algorithms and how they govern our interactions is important, I’d rather focus on a less commonly discussed concept — that of social capital, or the value of your connections and relationships. Having both quantitative and qualitative value, social capital is critical in sharing your message (Daly 2011). Your capital is a large factor when many social media algorithms determine the relevance of your content and with whom to share it. Shortly after the 2016 election, the Washington Post published a story discussing Russian trolls and how they spread misinformation about the election. These trolls would create vast networks of social media accounts and websites to spread their message and amplify its volume (Timberg 2016). In “Trolling for Trump: How Russia Is Trying to Destroy Our Democracy,” Weisburd, Watts, and Berger discuss how Russians would use “honeypot accounts” to build trust and then use their networks to spread misinformation through American dissenters, especially on the far right (Trolling for Trump 2016). This misinformation is then shared, connections are made, and the sum of all this social capital causes a story to go viral. Social capital wasn’t the only tactic at their disposal, though. It was merely a facilitator for a more sinister tool.

INFORMATION MANIPULATION

It goes almost without saying that the foundation of political debate lies in information manipulation. Neither party will ever tell a “bald-faced truth” (BFT) — sometimes referred to as “being brutally honest.” In normal interpersonal communication, an aversion to BFTs can help maintain a stable and happy relationship; there is no need to share irrelevant or mundane information with every party. That same thought process can be applied to political debates. Rarely do advocates, politicians, or supporters on either side of the spectrum disclose the entire truth. On the other hand, “bald-faced lies” are uncommon as well; these are statements with no basis in truth. Most statements exist in a gray area between truth and lie, and this gray area is the arena of information manipulation — “the covert manipulation of information along multiple dimensions and as a contextual problem-solving activity driven by the desire for quick, efficient, and viable communicative solutions” (McCornack et al. 2014).

In his work on the subject, McCornack posits that viewing honesty as a dichotomy rather than a spectrum ignores important aspects of communication. As such, he identifies four areas of a message that are commonly manipulated:

  • Quantity — how much information is shared
  • Quality — the veracity of the information
  • Manner — the way in which the information is presented
  • Relevance — the importance of the information

Manipulating any of these areas violates “the principles that govern conversational exchanges” (McCornack 1992).

While information manipulation is a natural part of interpersonal communication, there are potentially dire consequences when conversational norms are violated. People have a truthfulness bias, meaning we are more likely to judge a message as truth than as a lie (Park et al. 2002). Because of this bias, fake news sites came to dominate media coverage during the 2016 election cycle with stories about Clinton’s health or inflated accusations about Trump. Oftentimes these stories would provide little information, perhaps containing a single out-of-context quote, and would be delivered in a manner that looked like genuine news while also being styled in an intentionally inflammatory way. This truthfulness bias, in conjunction with confirmation bias — the tendency to agree with evidence that supports your preconceived beliefs — would lead people to share the story, engage further, and debate the topic.

Twitter, one of the most efficient interpersonal communication tools, offers a wealth of examples of information manipulation. The character limits imposed by the platform force journalists, politicians, and supporters alike to ignore nuance and tweet only what they want to highlight. No one serves as a better case study than President Trump. His large Twitter following before the election and his expert-level use of the platform during it inarguably contributed to his electoral success. Many of his tweets reveal varying degrees of information manipulation, and bald-faced lies aren’t hard to spot either. One tweet in particular, however, exemplifies McCornack’s quantity, quality, and manner violations.

Yes, it is statistically true that 45,000 construction and manufacturing jobs were added in the U.S. Gulf Coast and that $20 billion of investment is going to the region. What is not stated, however, is that these events would have occurred regardless of who was president. As Kevin Drum points out in Mother Jones, the investment in the Gulf and the job growth were the result of an investment plan started during the Obama administration (Drum 2017). A reader not versed in economics or current events will read this statement and believe the gains are a direct result of something Trump has done. The tweet thus violates “the principles that govern conversational exchanges” by failing to disclose pertinent details, a quantity violation. Additionally, the tweet violates the manner of the information by not displaying historical data showing growth in jobs and investment over time.

If you open the replies to this tweet, you can see many other violations from the opposition. In one reply, @JordanUhl focuses on Trump’s claim of winning and attempts to divert the conversation to the wiretapping controversy and Trump’s perceived hypocrisy in his business practices.

The statements above violate the relevance of the information by changing the subject to a topic the user is more familiar with, thus manipulating the conversation’s information flow. After further replies to J from other users, and further relevance manipulation, the topic turns to the Russia controversy. D then jumps in and manipulates the relevance of the information even more, effectively ending the conversation.

HUMOR

Comedians like John Oliver, Stephen Colbert, and Seth Meyers dominated political humor in 2016 and continue to do so today. With their shows cut into pithy YouTube clips, it is not surprising that other users try to emulate their takes and use humor in political discourse, whether through sarcasm, jokes, or memes. It should also come as no surprise that this method of communicating one’s support for or opposition to an issue seldom yields positive results. According to Bippus, humor is rarely well received in conflict because it can come off as an attack or as dismissive. While humor does have the potential to alleviate tension in conflict, it often fails in that attempt.

To understand humor’s effect within conflict, and thus within political discourse, we must first understand attribution theory. It states that recipients react to others’ behaviors based on their own perceptions, not on the sender’s intentions. There are a variety of motives a receiver can attribute to humor usage, ranging from selfish ones, like trying to relieve one’s own stress, to more altruistic ones, like searching for common ground.

This theory explains why jokes in comment sections on the Internet can be viewed with such hostility. An important part of judging a speaker’s humorous intention lies in intonation and facial expression — two attributes of communication absent from online discourse.

The second risk for humor online comes down to one simple truth: most people aren’t comedians. A joke’s delivery is crucial if it is to be effective and achieve the teller’s goals. Subject relevance, humorous quality, and timing are three key factors of a joke that the Internet can adversely affect. As Bippus cites, “Humor is a nonliteral form of communication in violation of Grice’s maxim of truth...” The nature of humor causes it to violate the rules that govern communication and, as a result, to be more strictly critiqued. If the receiver rates the three key factors listed above as poor, the joke will insult the recipient and “be seen as a diversion from the rules of cooperative conversation.” If the key factors are rated well, the violation is often forgiven.

The third pitfall of humor usage in online discourse, and quite possibly the most dangerous, is that everyone has a different sense of humor. It’s easiest to find examples of this on any comment thread involving people who advocate for social causes — derogatorily referred to as “social justice warriors” by their opposition. Social justice advocates have deemed a variety of topics “unfunny,” ranging from fat shaming to racist jokes, and to their opponents it can seem that no topic remains “in-bounds.” This inability to know another user’s humor preferences leads to misinterpretation, which stifles further dialogue (Bippus 2003).

The attribution theory of humor shows us why humor is risky in conflict communication. As an example of humor going wrong in online discourse, a connection of mine, M, recently shared this exchange with B. It is easy to view B’s comment as intentionally inflammatory, and it likely was. However, for the sake of this example, I give B the benefit of the doubt and assume he was trying to make a humorous quip about Obamacare being unfair to the taxpayer. For context, the screenshot shared by M shows a comment she received on an earlier post about how Obamacare saved her life and her concern about its repeal. Shortly after this comment was posted, B was attacked by numerous other people for being callous, despicable, and, in so many words, an asshole.

People’s use of humor in online political discourse isn’t limited to comment sections; memes are also popular. A well-written meme can be a great way to relieve tension; however, memes can be just as dangerous and ill received as a joke in a comment thread. While memes in general are often benign, political memes are designed to attack an opposing viewpoint or politician. In the 2016 election cycle, memes dominated social media and would often be the catalyst for debate. Because of the offensive nature of political memes, however, they seldom contributed to insightful, positive discourse, as this classic Willy Wonka meme suggests. The benefit memes have over a simple joke is that they provide additional cues from which the receiver can make attributions. Seeing facial expressions, or being well versed in meme culture, can provide leeway for the sender and minimize the risk of offending.

While humor can be useful for relieving tension during conflict, the limitations of online communication increase its risk by stripping out the cues receivers need to perceive humor as it was intended. Fortunately, adept Internet users have developed shorthands like “/s” or “lol” to append to jokes and clearly signal when humor has positive intentions.

POST-INOCULATION TALK

Excluding Internet trolls, the end goal of any discourse is to influence the other side and, hopefully, bring them to yours. As such, it would be remiss not to discuss post-inoculation talk (PIT). To understand PIT, one must first understand inoculation theory, which explains how to influence others and build an individual’s resistance to the opposition. When a threat is explained to an individual and a refutational statement is then provided, the subject becomes more likely to resist further persuasion attempts from the other side. PIT takes the concept one step further by studying the social aspect of inoculation. Because a threat is uncomfortable, PIT research suggests that inoculated individuals are more likely to discuss the threat component, and then the details of the refutational statement, with their friends and family — further spreading the inoculation.

After receiving the first inoculation, an individual will likely have an internal dialogue, which serves to further solidify the opinion and build stronger resistance to oppositional persuasion. Eventually, however, the internal dialogue will manifest externally and the inoculation will spread. Moreover, this external conversation is believed to strengthen resistance even more than the internal dialogue does. The process then reaches others who, unless previously inoculated to the contrary, will spread it in turn.

Another effect of inoculation is that it boosts confidence in minority opinions, alleviating the “fear of isolation” and encouraging the spread of those opinions (Ivanov et al. 2012). This effect, in conjunction with increased talk surrounding negative affect, can be credited with a rise in a variety of minority opinions in American politics. Ideas like racism or socialism have long been on the fringe of American dialogue; however, these beliefs have risen in popularity due to the Internet and the ease of inoculation. This has encouraged many previously silent members of the electorate to get involved — for better or worse. Anger with the establishment gave credence to groups that had been shunned from mainstream discourse. That anger, combined with bolstered minority opinions and inoculation, contributed to the rise of the Never Hillary movement, which could be considered a contributing factor in Clinton’s loss in the 2016 election. As the AP reported in July of 2016, “They heard the pleas for unity. The dire warnings about a Donald Trump presidency. The tributes to Hillary Clinton as a champion of the downtrodden. And they were not moved” (The Associated Press 2016). This introduction perfectly demonstrates why some Sanders supporters resisted Clinton’s attempts to unify the party. Throughout the primary, these “Bernie Bros” had received inoculation messaging that provided countless refutations of Clinton’s ethics, trustworthiness, and ability to lead. When Clinton won the nomination, members of this movement were furious at what they perceived to be an affront to democracy, unfair primary practices, and corruption. Whether these claims are substantiated or not, the fact is that Clinton earned almost four million more votes than Sanders; he lost the popular vote (2016 Democratic Popular Vote). The threat of Clinton was perceived to be so great that these voters resisted attempts to prove she was a lesser threat than Trump.

How does this translate to online political discourse? The sheer number of voices on the Internet sharing articles, memes, and opinions means inoculation attempts toward one side or the other are constant. Additionally, with each click of Facebook’s share button, a user further strengthens their own resistance to influence. The ease of sharing inoculation attempts on social media is where PIT reaches dangerous levels. People’s social networks are larger than they were in 2012, when Ivanov et al. published their findings on PIT, and thanks to the Internet, PIT is amplified accordingly. As more inoculation attempts are made, more PIT occurs, and stronger resistances are built. This explains why some people believe debating online is a waste of time: opinions will likely never change.

CONCLUSION

While much of interpersonal communication research focuses on in-person communication, many of the same principles and theories apply to online communication, in both private and public debates. Information manipulation occurs in the same way, with all of the same violations; humor becomes riskier and thus a weaker tool for managing conflict; and social capital is amplified to extreme levels, which in turn strengthens PIT. Writing this, I realize I have painted a grim picture of the value of online debate, one that may even make you long for the days when cat photos dominated the Internet and social media had not yet invaded our lives. That was not my intention.

As a regular online debater, I hope to leave you with ideas on how to improve the level of discourse online and help it live up to its potential as a common meeting ground where all people can share their ideas. First, by understanding how people manipulate information, you can better identify truths, lies, and mistruths and counter them; on the flip side, that understanding can also help you craft stronger arguments. Second, avoiding humor lowers the chance of offending another person and helps facilitate a deeper conversation. Third, a strong understanding of your social capital can help defend your arguments by having a variety of voices amplify yours and share your message with a broader audience. Finally, in tandem with your social capital, using your message to inoculate individuals and relying on PIT to strengthen your position will help your opinion cut through the noise on the Internet. Grasping the theories laid out in this paper and applying them to your communication will raise the discourse in your corner of the Internet, hopefully spread to others, and, with a little luck, make the Internet great again.

WORKS CITED

2016 Democratic Popular Vote. (n.d.). Retrieved April 23, 2017, from http://www.realclearpolitics.com/epolls/2016/president/democratic_vote_count.html

Bippus, A. M. (2003). Humor motives, qualities, and reactions in recalled conflict episodes. Western Journal of Communication, 67(4), 413-426.

Daly, J. A. (2011). Chapter 8: "Network!". In Championing Ideas and Influencing Others: The Authoritative Guide to Selling Concepts, Compelling Action, and Getting Results (pp. 167-187). New Haven, CT: Yale University Press.

Drum, K. (2017, March 8). Trump touts Obama job creation in tweet. Retrieved April 23, 2017, from http://www.motherjones.com/kevin-drum/2017/03/trump-touts-obama-job-creation-tweet-today

Ivanov, B., Miller, C. H., Compton, J., Averbeck, J. M., Harrison, K. J., Sims, J. D., . . . Parker, K. L. (2012). Effects of Postinoculation Talk on Resistance to Influence. Journal of Communication, 62, 701-723. Retrieved April 23, 2017.

McCornack, S. A. (1992). Information manipulation theory. Communication Monographs, 59(1), 1-16. doi:10.1080/03637759209376245

McCornack, S. A., Morrison, K., Paik, J. E., Wisner, A. M., & Zhu, X. (2014). Information Manipulation Theory 2. Journal of Language and Social Psychology, 33(4), 348-377.

Park, H. S., Levine, T. R., McCornack, S. A., Morrison, K., & Ferrara, M. (2002). How People Really Detect Lies. Communication Monographs, 69(2), 144-157.

The Associated Press (2016, July 27). AP EXPLAINS: What's driving the Never Hillary movement? Retrieved April 23, 2017, from https://apnews.com/fec897ac798d4f2d928f040cb0d0ff54

Timberg, C. (2016, November 24). Russian propaganda effort helped spread 'fake news' during election, experts say. Retrieved April 23, 2017, from https://www.washingtonpost.com/business/economy/russian-propaganda-effort-helped-spread-fake-news-during-election-experts-say/2016/11/24/793903b6-8a40-4ca9-b712-716af66098fe_story.html

Trolling for Trump: How Russia Is Trying to Destroy Our Democracy. (2016, November 5). Retrieved April 23, 2017, from https://warontherocks.com/2016/11/trolling-for-trump-how-russia-is-trying-to-destroy-our-democracy/
