
Archive for the ‘The Electronic Age’ Category

Although there is no backstage on the internet, the perceived anonymity it offers poses problems when people use that anonymity to attack others. In a new podcast, fittingly titled “Conversations with People who Hate Me,” Dylan Marron will converse with some of the people who have attacked him online. Similar to the messages delivered to commenters by Jay and Silent Bob in Jay and Silent Bob Strike Back, but with words instead of physical attacks, these confrontations let posters know that there are real people behind the personas on one’s screen. It will be interesting to see what kind of results these conversations have.

The trailer is below:



In some ways, even “family friendly” video games are centered on violent acts. The first moments of Super Mario Bros. (1985), for example, likely involve the player stomping a goomba to death and repeating this act countless times over the duration of the game and its many sequels. There is a qualitative difference, though, between stomping a goomba and shooting a realistic-looking human with a virtual gun. I recently realized the desensitization to virtual violence that can take place over a lifetime of playing violent video games and that it is possible to become resensitized to this violence.

Other than NES games like Contra (1987), my first experience killing virtual humans (and dogs, incidentally) came in the early ’90s while playing Wolfenstein 3D (1992) on my family’s first computer (with a 33 MHz Intel 486 and sans sound card). Its rudimentary 3D corridors were amazing to me at the time, and the ability to shoot Nazi guards and see their blood splatter seemed “cool” to my pre-teen self (and his friends). Nothing about the experience (or that of playing Doom (1993) a few years later), however, suggested that killing real people would be anything like it.

Around this time, I remember seeing the game Pit Fighter (1990) in the mall’s arcade and noting its realistic graphics. Pit Fighter used the same digitization process that later made Mortal Kombat’s (1992) over-the-top violence infamous, with real actors playing the parts of its characters. When I eventually purchased a copy of Mortal Kombat, I was sure to get the Sega Genesis version, which had a code that allowed you to see blood like in the arcade. Nintendo seemed to learn that kids like blood, since Mortal Kombat II (1994) was released on the Super NES with blood intact (unlike the virtual “sweat” of the first game), and I bought that version as a result. Again, there was nothing about the experience of playing these games that seemed to reflect real-life murder. Having never been in a fight, I certainly wasn’t going to pull out anybody’s spine or beating heart.

The beginning of college took much of my attention away from video games but I later revisited the Nintendo 64 classic Goldeneye 007 (1997) after buying it at a pawn shop early in grad school. The characters looked like faces pasted on blocks of wood so, again, I never got much of a sense of killing real people. Because of my late arrival I also missed out on the deathmatch aspect of Goldeneye. My first taste of shooting at friends in a video game came with Halo (2001) on the original Xbox. I particularly remember playing on my friend’s huge (maybe 35 inch) SDTV. This, I think, is where my desensitization to virtual violence really began.

During the single-player campaign, Halo players kill multitudes of strange-looking aliens. Multiplayer options allow players to kill each other’s space marine avatars, but helmets and body armor prevent them from looking particularly human. Starting with Halo and continuing with its sequels on the Xbox and Xbox 360, I spent hours killing various aliens, friends, family members, and strangers on the internet. The Gears of War (2006) series on Xbox 360 also tasked players with killing strange-looking alien creatures from beneath the planet’s surface.

I spent so much time killing aliens and unrealistic-looking humans during these years that I don’t remember thinking twice about killing more realistic humans in games like Call of Duty 4: Modern Warfare (2007) and its sequel (although the sequel’s immersion of the player into a terrorist act got some attention from others at the time). I also played these games with family members and didn’t think anything of “killing” them even when they looked like soldiers.

The last game based on shooting virtual people or things that I played before this year was probably Gears of War 3 in 2011. In the meantime, things like teaching, research, and changing jobs took up much of my time and the time I devoted to video games was more likely to be spent killing goombas in Super Mario 3D World (2013) or driving virtual go karts in Mario Kart 8 (2014). Over the summer, though, my curiosity was piqued by articles about Battlefield 1 (2016) and its approach to World War I. The only issue was that I did not own one of the new video game consoles or a PC on which to play it. Instead, I sought an Xbox 360 launch title, Call of Duty 2 (COD2), that focused on World War II and had gotten good reviews upon its release in 2005.

COD2 is what made me realize how desensitized I had been to virtual killing in the past. As an 11-year-old game, its representations of humans are not incredibly realistic, but I was constantly aware that the avatars I was mowing down with a machine gun represented people (a sense I do not recall having when playing its sequels five years ago). While this was surprising to me, it also heightened my sense of the cost of WWII in terms of human life, a point that the creators of Battlefield 1 also make in terms of WWI. One important difference between COD2 and Battlefield 1 (which I have not played), though, is the technological advancement of the past 11 years. Battlefield 1 looks incredibly realistic, which might make the experience more jarring for those who are not desensitized to virtual violence while also furthering the desensitization of those who are. Had I not stepped away from virtual warfare for several years, after all, it is unlikely that my response to it would have been the same.

I would not argue that games like this should not exist. By many accounts, Battlefield 1 does a good job of communicating the horrors of WWI in the same way that a good book or film might. I do think it is important to be aware of how these forms of media affect us and our responses to violence, though. This is particularly problematic in terms of video games because, unlike authors or screenwriters, some video game developers seem to have a hard time envisioning solutions to problems that do not involve violence. The 2014 game Watch Dogs, for example, focused on hacking but required players to shoot a large number of people. Other than The Matrix (1999), I don’t recall a lot of movies about hacking that focused as much on gun violence as actual hacking.

This is unfortunate. Not every game involves virtual murder but it would be nice to see more room for genres that focus on other forms of problem solving. (Notably, Portal (2007) focuses on problem solving while also making light of the callous disregard the protagonists of many games have for those around them.) Adding to the problem-solving toolbox might allow more space for players to be aware of the effects their virtual behaviors have on their perceptions of violence inside and outside of increasingly-realistic game worlds. After all, if all you have is a gun, everything looks like a target.

See also: this related article at Slate, which was published after I had written this post but before it was posted.


“Like” Memoirs of a SLACer on Facebook to receive updates and links about the murder of Bowser’s seven children via your news feed.


Link: https://twitter.com/neiltyson/status/695759776752496640


“Like” Memoirs of a SLACer on Facebook to receive updates and links about the difficulty of sociology via your news feed.


Continuing on the topic of student conceptions of research, another issue I have encountered as students conduct literature reviews is the belief that Jstor is the first and last place to look for academic research. This belief seems to be less prevalent at my current institution than my previous one, but many of my past students never even considered looking for sources outside of Jstor due to the convenience of full-text articles.

One problem with this is the fact that Jstor only provides results from the journals in its own collection, artificially limiting the resources that students have available to them to whatever Jstor has been able to negotiate for. (I wonder if students would be equally willing to limit their movie viewing to those that are available for streaming on Netflix, which has similar convenience and limitations.) The second problem is that even when Jstor does include a particular journal, access to that journal is often limited by a “moving wall” of three to five years. There are many topics for which recent publications contain important insights that were overlooked in the past but that students using Jstor would not have access to for several years (I was once accused of not knowing the literature in a particular area because I had not cited an article published a month or so before I submitted a paper to a journal!).

These issues can cause problems but are not lethal to a student’s chances of doing well. A much worse (though much less frequent) problem I’ve had when students use Jstor is that they think of Jstor as the source of the articles they are using. In the minds of some students, they are reading articles from Jstor rather than from The American Review of Criminal Awkwardness because that is where they got their articles. These rare students don’t realize that Jstor is like a shelf holding specific issues of specific journals rather than a publisher of academic information.

As professors, we can begin to address these issues with our students but the ASA citation guidelines can also help by not instructing students to include web addresses for PDFs that they downloaded from online databases. It is time to recognize that the database through which you access a source is not nearly as important as the original source of the source! (A source is a source, of course, of course…)


“Like” Memoirs of a SLACer on Facebook to receive updates and links via Jstor in your news feed.


There are some jobs that are typically recognized as difficult. Most people, for example, probably don’t think that they could walk into an operating room and be a successful surgeon. Others, however, are often assumed to be easy. Teaching, for example, is something that many people assume they could be successful at. I’ve also seen musicians criticize those who make electronic music because they are “just pushing buttons.” As with teachers and electronic music artists, assuming that somebody has an easy job devalues the work that they do.  Once in a while, though, people have the opportunity to try something that others make look easy, discovering that it is, in fact, rather difficult.

Enter Super Mario Maker.

Super Mario Maker is a video game for Nintendo’s Wii U game console. In the game, players are able to create their own Super Mario Bros. levels, share those levels, and play levels created by others. In reviewing the game, Sean Buckley of Engadget summed up his experience nicely, stating:

It didn’t make any sense. I’d dreamed about making Nintendo games since I was 6 years old, but when the company gave me the chance to prove a game design genius lived under my skin, I flopped. It was then that a shocking and heartbreaking realization washed over me: I hate making video games.

My ego didn’t take this realization well. As both a hobbyist gamer and a journalist that covers games, I’ve always humored the little voice in the back of my head that said, “I could do this if I wanted. I could make games.” No, Super Mario Maker has shown me, I can’t — not really. Yes, technically I can construct a stage from set pieces I’ve seen in other Mario games, but I’m not really creating anything. My by-the-numbers Mario levels (a few power-ups to start, some pipes to leap over, maybe a Hammer brother or two and a flagpole at the end) feel more like light plagiarism than original content. Why do I suck at this so much?

Michael Thomsen at the Washington Post focused on how bad others are at creating Super Mario levels, arguing:

“Super Mario Maker” is a bad comedy. Released in coordination with the 30-year anniversary of “Super Mario Bros.,” it indulges players in the fantasy that they’d be good at making video game levels. This sort of self-deception has become common in the age of digital consumption, and while there’s something utopian in “Super Mario Maker’s” appeals to community participation and sharing, the game quickly collapses into a scratch sheet of horrible ideas and levels you’ll regret having played. It’s a tool for the mass production of cultural refuse, single-use distractions that fail to replicate the spirit of the original.

So it turns out that the people who have been making the Super Mario Bros. games all these years actually have talents and skills that most of us don’t. I think this is great! I wish that we could have other opportunities to try what people do in a simplified manner. Imagine Super Teacher Maker, where surgeons are given seven hours in a room with 25 eight-year-olds and asked to teach them math, or Super EDM Maker, where a guitar player (or, better yet, a singer!) is given a computer and asked to create music. Maybe then we would start to recognize that everybody’s job is hard, even if our jobs are hard in different ways.

“Like” Memoirs of a SLACer on Facebook to get updates and other posts about how hard it is to be a professor via your news feed.


During my second-to-last class period of the semester, I was standing at the front of the classroom talking, as I often do during class, and the light above me went out. The power had not gone out. Nobody had inadvertently hit a light switch. Just one light, directly above my head, that decided it had had enough.

A few days later, when I arrived for my final class in the same course, the clock had stopped working, its hands resting on the numbers indicating that class would start in ten minutes. The clock couldn’t stand the thought of even one more minute of class.

When I arrived for the final exam period this afternoon, I began handing back an assignment from earlier in the semester as a student took a chair from the back of the classroom up to her usual spot. Looking around, I noticed that about eight chairs had decided to abandon their posts and wander off into a nearby alcove.

Professors often make jokes about students, but I appreciate the fact that despite our classroom’s insistence that the semester had ended, my students did not give in.

“Like” Memoirs of a SLACer on Facebook to receive updates and links in your news feed, assuming your news feed doesn’t abandon you.


The other day I saw a link online during my “lunch break” (i.e., the time between 12 and 12:30 when LeechBlock allows me to view my usual websites) about the difficulties Peter Jackson and others faced while working on the recent Hobbit movies. The danger inherent in reading something on the internet, of course, is that it might cause you to read something else on the internet. In this case, I wondered if any fans had edited the three Hobbit movies into a shorter, more cohesive movie. It turns out that somebody has. Called “The Tolkien Edit,” this version trims many of the parts that were not in the books, resulting in a single four-ish hour movie. The website links to a torrent for the edited version, which is surely illegal but has gotten quite a bit of media attention. “I would be interested in watching that,” I foolishly thought, “but I don’t have anything that allows me to download torrents on this computer.” “Hey,” I continued even more foolishly, “the website has a link to a torrent client. I should click that.”

NOTICE: YOUR ACCESS TO THE INTERNET HAS BEEN SUSPENDED DUE TO ILLEGAL FILE SHARING. PLEASE REMOVE ALL FILE SHARING PROGRAMS AND CONTACT IT SERVICES IN ORDER TO RESTORE SERVICE.

My first thought upon receiving this notice was, “Shit!” My second was that I didn’t even download the software. My third was that explaining all of this to IT over the phone was going to be embarrassing. My fourth was that I had to go to the bathroom and that I should do so before making a phone call to deal with all of this, since I had a lot of work to do that afternoon that would require access to the internet. Thankfully, somebody in IT must have noticed that the idiot in this case was a faculty member and restored my network access in the few minutes it took me to walk down the hall to the bathroom and back.

There are two morals to this story. The first is that considering downloading potentially-illegal files from work is stupid. The second is that if they wanted to, the people in IT could probably access a log of every stupid website I’ve ever visited while on the campus network, which makes me consider visiting fewer stupid websites. I can just see the letter reporting the outcome of my tenure application: “Although John has been marginally productive, the committee has regretfully decided not to grant tenure in his case. If even a fraction of the time he devoted to Buzzfeed quizzes had been spent on scholarship, it is likely that this outcome would have been different.”

“Like” Memoirs of a SLACer on Facebook to receive updates and links via your news feed, but maybe not from work.

