Semiotics as a “Conceptual Crowbar” to Analyze Codes

What are Codes?

Daniel Chandler’s book Semiotics: The Basics offers a crash course on a varied, intricate, and seemingly impenetrable discipline. In his second edition (2007), he writes a chapter focusing on the concept of codes as they relate to theories of semiosis. Chandler defines codes as “set[s] of practices familiar to users of the medium operating within a broad cultural framework” (148). According to Chandler, “Codes organize signs into meaningful systems which correlate signifiers and signifieds through the structural forms of syntagms and paradigms” (147). Signs that are grouped on “the basis of commonality and membership” (Kim 57) are considered paradigms, while signs that are specifically combined with intent, much like a sentence, are syntagms. Paradigm and syntagm are the vehicles that allow signs to carry meaningful messages and thus allow for communication between people.

While paradigmatic and syntagmatic structures aren’t the only ways to understand codes, Chandler emphasizes their role throughout his chapter. Meanwhile, in The Quest for Meaning, Marcel Danesi elaborates on “associative structures,” which describe how abstraction and metaphor encode meaning. Danesi additionally claims that codes have three major components: representationality, interpretability, and contextuality. In other words, when signs combine into codes to represent something, that representation contains a message that can be interpreted by anyone who is “savvy” with the code, and that message is embedded within particular social contexts. Codes come in a variety of types, and each author presents their own taxonomy. Chandler’s overview of code categories includes social, textual, and interpretative codes. Danesi, on the other hand, focuses his taxonomy on social, mythic, knowledge, and narrative codes. Kyong Liong Kim, finally, addresses logical, aesthetic, and social codes. While each taxonomy varies, the underlying principle is that codes undergird domains across time and space and influence social structure and organization.

Codes affect and even regulate all sorts of behavior, from how we dress to what is appropriate to say in a given social context. Many social codes are learned early in life, to the point where “having internalized such codes at a very young age, we then cease to be conscious of their existence. Once we know the code, decoding it is almost automatic and the code retreats to invisibility” (Chandler 166). Danesi makes a similar connection when he writes, “once the ‘objects’ have been encoded by language (or some other code) they are perceived as ‘necessary’ or ‘natural’ discoveries of reality, not just as convenient signs” (79). Both Chandler and Danesi contend with perceptual codes and how these codes might map onto theories of epistemology and ontology. Addressing the Sapir-Whorf Hypothesis, both authors explore the notion that people who are exposed to certain codes within a culture tend to be predisposed to “seeing” the world a certain way. Though the strong version of the hypothesis is not widely accepted, language does seem to influence the shaping of cultural codes. Because codes operate “under the surface,” so to speak, in that they are usually tacit and interpreted automatically, they have great power to shape perception.

How do codes operate in practical contexts?

Chandler offers a fascinating example of codes at work when he describes how filmmakers use the “eyeline match” technique in editing film (168). Essentially, the technique is based on sequencing shots so that something a character is gazing at off-screen will be represented in the following shot. Using this technique gives the movie a sense of continuity, which enhances its “realism” for the viewer. This, in turn, helps render the editing “invisible” because viewers already watch these movies with an expectation (through genred reading practice) that these shots will take place. As Chandler aptly puts it: “The seamlessness convinces us of its realism, but the code consists of an integrated system of technical conventions” (166). Other editing techniques, as described in this MIT film lexicon, all make up a code for understanding how movies are constructed from a production viewpoint. However, the “cinematic editing code has become so familiar to us that we no longer consciously notice its conventions until they are broken. Indeed, it seems so natural that some will feel that it closely reflects phenomenal reality and thus find it hard to accept it as a code at all” (Chandler 168). Thus, audiences become accustomed to interpreting film in certain ways, based on how the codes for its production are “naturalized,” which can affect their interpretations of the text and subsequent social interactions.


Screenshot from The Matrix (1999), depicting the computer code which makes up the main program. Credit: CinemaSquid

Leveraging the Conceptual Crowbar

Chandler provides his most poignant metaphor of the chapter when he writes: “Semiotics offers us some conceptual crowbars with which to deconstruct the codes at work in particular texts and practices, providing that we can find some gaps or fissures which offer us the chance to exert some leverage” (173). This particular hypothesis on how codes function reminded me of The Matrix movie series. The major premise behind the films (spoiler alert) is that humans are actually “plugged into” a virtual world created by artificially intelligent machines. Neo, the main character, is initially conditioned to see a reality that represents life in America in the year 2000. However, as Neo meets people who live outside the Matrix (in the “real” world) and begins to develop his own powers, he finally gains the ability to “see the code.” He no longer sees the representation of the world the machines transmitted; instead, he literally sees the green symbols that comprise the Matrix program. As Chandler notes, “Understanding what semioticians have observed about the operation of codes can help us to denaturalize such codes by making their implicit conventions explicit and amenable to analysis” (173). Danesi adds that “although human beings are indeed shaped by the cultural system in which they are reared, they are also endowed with creative faculties that allow them to transcend it and even change that very system” (95). Like Neo, we’re born into a specific culture whose codes we have no control over; yet we still possess the power to change and modify those codes, since they are not static but dynamic. I wonder, however, whether some codes are so seemingly “natural” and so entrenched that changing them would upend the entire social structure.

References

Chandler, Daniel. Semiotics: The Basics. Routledge, 2007.

Danesi, Marcel. The Quest for Meaning: A Guide to Semiotic Theory and Practice. University of Toronto Press, 2007.

Kim, Kyong Liong. Caged in Our Own Signs: A Book About Semiotics. Greenwood Publishing Group, 1996.

Translingualism: A Model for the Globalization of English

Canagarajah’s Model of Translingualism

Theories for describing, analyzing, and predicting the spread of English around the world have constantly undergone evolution and scholarly debate. In his book Translingual Practice: Global Englishes and Cosmopolitan Relations (2013), Suresh Canagarajah outlines his argument for how English should be analyzed as a translingual practice. In chapter four of the book, “English as Translingual,” Canagarajah offers his critique of major theories of English as a global language, including “World Englishes” (WE), “English as an International Language” (EIL), and “English as a Lingua Franca” (ELF). In opposition to these models, Canagarajah challenges the analytical focus on categorizing, classifying, and defining the seemingly endless varieties of English. Instead, he argues that a “more productive undertaking is to identify the processes underlying the construction of all these varieties” (59). He believes that previous models have focused too much on the “product” of varietal Englishes. As he puts it, “These models end up reifying each variety, limiting further changes, and preventing us from being open to studying further diversification. The focus on the product also takes away our attention from the processes of contact, mobility, and sedimentation that underlie these varieties. It also prevents us from understanding the dynamics of meaning-making practices” (56). In other words, models like Kachru’s Inner/Outer/Expanding Circle fail to account for the fact that English is primarily a “contact language” (56).

Canagarajah complicates Kachru’s “circle” theory by articulating how interactions within the circles and at their borders require an analytical frame of English as a “communicative practice” rather than as “stable [varieties]” (69). He gives an interesting example of two individuals who have to engage in meaning-making practices in order to communicate effectively about a large order of cheese. Canagarajah’s analysis of how they negotiate power dynamics in order to communicate focuses heavily on the use of the word “blowing.” He claims that “from a semiotic perspective, even the use of the same form or vocabulary item in a different context may take new indexicality” (69). In other words, the use of the word “blowing” pointed to a new, contingent referent. Considering that contemporary semiotic theory suggests the signifier/signified relationship can never have prescriptive 1:1 matches, this example lends credibility to the notion that what happens in the “contact zones” of language should be a focal point for understanding how English works in communicative situations.


Wikipedia, the “global” encyclopedia, boasts nearly 4.5 million articles in English – more than any other language.

Globalization’s Role

Canagarajah’s argument resonates with the argument Alastair Pennycook puts forth in his piece, “English and Globalization.” Pennycook, citing Giddens, claims that globalization “may be better understood as a compression of time and space, an intensification of social, economic, cultural and political relations and a series of global linkages that render events in one location of potential and immediate importance in other, quite distant locations” (114). Rather than focus on English through the lens of the nation-state, Pennycook, like Canagarajah, accepts that global English use operates in an “uneven world” full of power struggles, but argues that we need to “[question] the ways in which we have come to think about languages within colonialism and modernity, and regarding the grand narratives of imperialism, language rights, linguae francae or world Englishes with suspicion, this perspective looks towards local, situated, contextual and contingent ways of understanding languages and language policies” (121). In other words, simply labeling each variety of English (e.g. “British English” vs. “Indian English”) is ultimately less productive, since it neglects the complex “flow[s] of information” (114). Certainly, Pennycook doesn’t deny the historical evolution of English, its ties to colonialism, and its potential role in cultural homogenization, but he believes that Robert Phillipson’s argument of English as a force of “linguistic imperialism” misses the point about globalization. Instead, Pennycook posits that “it is also crucial to understand the ways in which English is resisted and appropriated, how English users ‘may find ways to negotiate, alter and oppose political structures, and reconstruct their languages, cultures and identities to their advantage. The intention is not to reject English, but to reconstitute it in more inclusive, ethical, and democratic terms’” (citing Canagarajah, 1999).

Both Pennycook and Canagarajah seem to operate with a similar analytical lens, aiming for a “ground level” view of English with an emphasis on the descriptive orientation of sociolinguistic study rather than a classificatory, prescriptive one. For both Canagarajah and Pennycook, we need better understandings of how English is used to communicate, rather than simply categorizing and classifying based on linguistic forms and difference. Thus, both scholars challenge previous theories and ask us to consider a more radical framework that moves away from “twentieth-century epistemologies” (Pennycook 121). As Pennycook argues, “globalization requires us to consider whether we should continue to think of languages as separate, distinguishable, countable entities” (116). Indeed, breaking out of the idea that languages, particularly varieties of English, are discrete raises a number of potential objections, including the following:

  • What methodologies would be valid for studying language from a “translingual” perspective? Valid according to whom?
  • If social and personal identities are tied to language, then what happens if we argue that languages are no longer discrete but just a messy, translingual blur?
  • If a strong form of translingualism is accepted, how might pedagogy account for the displacement of any notion of standards (i.e., teaching commonly accepted prescriptions)? In other words, what do we say to students who ask us to “tell them what’s wrong about their English” so they can communicate more effectively in high-stakes contexts, like the workplace?

It remains to be seen how “English as a Translingual Practice” might shift knowledge in fields from linguistics to composition to pedagogical practice, but it appears that this model is a perfect candidate for further research and inquiry.

References

Canagarajah, S. Translingual Practice: Global Englishes and Cosmopolitan Relations. Routledge, 2013.

Pennycook, A. “English and Globalization.” The Routledge Companion to English Language Studies, Routledge, 2010, pp. 113-121.

John McPhee Muses about Writing: “Get Your Own Hamper”


Image of author John McPhee. Credit: Office of Communications, Princeton University

I recently read a fascinating article from a 1991 issue of College Composition and Communication called “The Strange Case of the Queen-Post Truss: John McPhee on Writing and Reading” by Douglas Vipond and Russell Hunt. McPhee is well known as an essayist who has written extensively for The New Yorker, and he has been a major influence on the genre of creative non-fiction.

Methods

The authors employed two unique methods to gather information from McPhee, both of which yielded in-depth, honest results. As Vipond and Hunt state, these methods illuminate “knowledge about [writing] choices and strategies that… often remains tacit” (201).

  1. Discourse-Based Interview: Essentially, the interviewer brings in writing samples from an author’s original text along with alternative, modified versions of that sample (often a sentence or two). Then, the interviewer asks the author if they’d substitute an alternative for the original; if not, then they are asked to explain the “writerly logic” of why they prefer the original.
  2. Probes: The interviewer brings in statements about an author’s work made by general readers. The interviewer can then use these comments as a springboard to reveal dynamics in the writer-reader relationship.

McPhee’s Insights

Throughout the interview, McPhee responded to a number of proposed alternatives and reader interpretations. Here are some of my favorite quotes:

  • “If you can find a specific, firm, and correct image, it’s always going to be better than a generality” (203).
  • “But, actually, a touch is sometimes more helpful than a mallet stroke” (204) – on sentences that “carry a touch of humor.”
  • “If you trust your writer you can appreciate something without knowing what it is” (204) – on some of his readers not knowing what a “queen-post truss” was.
  • “These little affections, you don’t give them up” (206) – on picking up unusual phrases like “queen-post truss.”

McPhee also described some details about his research process for an essay he wrote on a forest that he discovered much later in life despite growing up nearby. In response to a reader’s comment that he used “too many facts” in the essay, McPhee asserts:

You know, I spent a week or so simply talking about this subject with people, and therefore had collected a great big hamperful of material about it. Then I turned around to do a piece of writing that would include what I thought was worth writing about, worth passing on, worth attempting to form into sentences and shape as a structure – an integral piece of writing. And then I turned to this reader and the reader said, “There are too many facts.” All I can say is, “Go get your own hamper.” (208).

Personally, I love that line: “Go get your own hamper.” Yes, it’s a witty shot fired at a random reader, but McPhee makes an excellent point. Writing is fundamentally a craft. Regardless of genre or purpose, when we write, we craft strings of symbols into some sort of message. Sometimes that message misfires, and the recipient misinterprets it or doesn’t apprehend it in the first place. Those moments can frustrate a writer, but McPhee realizes that the writer-reader relationship is always messy. In his view, as long as he crafts a piece that has been thoughtfully researched, weighed, and considered, then he’s doing his job as a writer.

Writing with “Rhetorical Hampers”

For me, that metaphor of the hamper is a great way to discuss my own writing process (and process in general). I realize that no matter how much stuff I throw in the hamper, there’ll always be a missing sock here or a lost shirt there. My hamper will never contain anything that can approximate “Truth” (with the capital T). So yes, I can accumulate a great deal of material, but it’s ultimately the “sorting and folding” of that material into intelligible thoughts that requires the most energy and determination. McPhee’s domestic metaphor seems less abrasive to me than another commonly uttered metaphor the industrial age imparted to composition: writing as “refining raw material.” To me, refining suggests that information is simply processed through a series of conveyor belts on the assembly line, that writing is a linear, repetitive process that can be completed in eight-hour shifts. The hamper analogy, instead, views writing in more realistic terms: as a process of going through various states of cleaning, drying, folding, arranging, sorting – but ultimately wearing.

Isn’t that what we do with knowledge all the time?

Thoughts on the Essay

For this quarter, I’m taking a course on the essay genre, and during my class tonight, our professor posed the following question: “What do you think when you hear the word essay?” We were given about ten minutes to write a short response and share with each other. The discussion afterward was illuminating to me because my peers had such varied mental images and associations with the term, ranging from essay as standardized testing component to essay as a form of personal liberation. Of course, many other interpretations and perspectives came out of that conversation, which we anchored to our readings on Montaigne’s conception of the essay as a “trial” of his judgment (“On Democritus and Heraclitus”).

Here is what I wrote in response:

In my experience, the word essay brings to mind an entry point to a particular discourse, often academic in nature. For me, the essay represents a way to participate in meaningful communication with a potentially large, yet unpredictable, audience. Back in high school, I used to think of essays as a stressful school assignment, but over the years, I’ve learned that they are engines for sustained thinking and meditation, which are fueled by personal experience and imagination. While I don’t think the essay is the “ultimate” genre for discourse, I do believe its sustained influence has to do with the fact that it connects the reader to other worlds and possibilities. Ultimately, I view the essay as a symbol for constructive cognitive dissonance between my reality and that of another individual’s reality.

I then started to write a couple of sentences about essays as a metaphor for a playground of ideas, but time ran out and I left the thought on the cutting room floor. Regardless, I’m curious to see how my conceptions will change during this quarter as I read more essays and write a few of my own. Consider this a “working definition” for now.

Go For Launch: Another Flight into the Blogosphere

This is the second site that I have launched in the past year; the first was my blog, Pixel Rhetorica. That project started off as a WordPress.com site that I migrated to a fully functional WordPress site with my own server space. The blog was originally intended for exclusive analysis of the video game industry through the lens of rhetoric and discourse theories, which I’m learning through my M.A. program. Recently, I decided to expand its focus to include more of my interests in new media topics. But I still felt something was lacking: my own professional site.

So why make this brand new site? Two reasons. First, I wanted a separate space for my writing portfolio and teaching materials, along with a wider net for posting about my experiences in graduate school. I felt these wouldn’t integrate very well on a specialized blog. The second major reason is that I’m realizing how important it is to “take control” of your online presence. If you type in the search phrase “stay off the grid,” you’ll find plenty of how-to guides and blog posts for avoiding online surveillance, assorted privacy threats, etc. However, after reading The Daily You by Joseph Turow, I’m not sure it’s truly possible to avoid the tentacles of data collection. Turow’s book tracks the process of online advertising and how data on consumers gets collected, sold, exchanged, and used throughout a massive, complex system. He particularly fears that the data advertisers collect on everyday citizens will be used to sort them into “reputation silos,” which could lead to forms of social discrimination. In an interview on NPR, he says these potential silos have “ramifications of how we see ourselves and how we see other people… and this is part of another issue we have to think about, which is information respect. Companies that don’t respect our information and where it comes from are not respecting us, and I think moving into this new world, we have to have a situation where human beings define their own ability to be themselves.”

For me, “going off the grid” is a truly unrealistic prospect, both personally and professionally. Instead, I’d rather fight the potential threats of identity silos by defining my own identity rather than letting others do it for me. Sure, this blog and my other online presences are a feast for data collectors. But at least the writing and data I put on here will be filtered through me first. Ultimately, I can’t necessarily control what comes up when someone searches “Jeff Melichar” in Google, but I do have some power (albeit limited) to use rhetoric and technology to present myself to the world the way I would like.
