As I love Apple, I figured I’d do one of their products for my heuristic comparison.  But I didn’t want to do the iPhone or iPod, since those have been analyzed to death, so I decided to focus on Mac OS X instead.  It’s one of their most often overlooked products, but it’s just as important as the aforementioned iThings.  I am both a PC user (cost and compatibility) and a Mac user, but in the end I’ll always prefer Mac OS X.  Here is its heuristic evaluation:

1)  Visibility of System Status.  When using Mac OS X, the system status is always readily apparent.  Open applications have a blue dot below them in the Dock to stand out, the active window is dark while the others are faded, and the always-visible menu bar at the top changes to reflect which program you’re in.  On a MacBook, battery life is displayed in the upper-right.  On both MacBooks and desktop Macs, other indicators are also displayed in the upper-right, such as the time, AirPort (wireless) status, and Bluetooth status.  If the Mac ever gets overloaded and freezes up, the user definitely knows, because the dreaded Spinning Ball of Doom replaces the mouse cursor.  Knowing what is going on with your Mac is never a problem.

2)  Match Between the System and the Real World.  Since this is an operating system, it doesn’t physically resemble anything in the real world, but elements within it do correlate to real-world equivalents.  One example is the ubiquitous office naming system – i.e. files and the desktop – along with some Mac-unique concepts like Stacks (multiple-file organization on the Dock) and Spaces (multiple desktop workspaces).  There is also the Dock, which houses the most-used applications and resembles a boat dock.  Icons in Mac OS X also look very much like their real-life equivalents, right down to the hard drive icon.

3)  User Control and Freedom.  The user is free to do as they please and has complete control over all shortcuts in Mac OS X.  They can open as many applications and windows as they like, customize keyboard shortcuts for features like Exposé, and adjust just about anything in System Preferences.  If they want to close a window but keep the application running for easy future use, they can simply close the window, and Mac OS X keeps the application running until they actually “quit” out of it.  Desktops and screen savers can be customized heavily as well, including not only the desktop image itself but also the organization of files on it (this may seem like a no-brainer, but it is important).  Hard drive contents are easily searchable with Spotlight.  Workspaces can be changed with the Spaces feature.  And let’s not forget Time Machine, the automatic backup system built into 10.5 and later.  With this feature, the user can not only back up when they want to, but also “time travel” to recover a file from any point in the past – say, if they changed something, saved, and a week later decided they wanted the earlier version back.

4)  Consistency and Standards.  Starting with Mac OS X 10.5 Leopard, all windows and Apple-built applications in the OS have the same uniform look and feel.  The menu bar at the top is always there, changing only its contents to reflect each program.  Icons for all programs share the same look and feel, no matter who developed them.  Across the Apple-supplied programs, graphics are similar, as with iTunes’ Cover Flow and iPhoto’s photo-viewing options.  Everything has the same polish and clean design.

5)  Error Prevention.  This is HUGE on Mac OS X, and one of the reasons I like it so much.  It has error prevention built into its core.  There are virtually no viruses, despite the rising number of Mac users, and nowhere near as much spyware can glom onto your system as on Windows.  Defragmentation is basically unneeded except in the direst of cases, since the OS does it in the background for you.  And the OS is stable.  Programs may crash sometimes, and the computer may get overloaded like any computer, but the vast majority of the time Mac OS X will not crash.  There are no Blue Screens of Death.  If a program is acting unruly, the user can right-click its icon in the Dock and Force Quit it, no Ctrl+Alt+Delete needed.  Simple as that.

6)  Recognition Rather than Recall.  This is something Apple had in mind from the very beginning when they designed the Mac OS.  Unlike Windows, which was originally developed by and for engineers, Mac OS was developed to be easy to use for the average person.  Once one goes through the basics of how to navigate files and programs, it is stupidly easy to do it again.  And yet, for someone used to Windows, recall may be much more strained when making the switch.  The menu bar is always at the top of the screen rather than inside each window, system options have to be accessed through the Apple icon, the close button for windows is in the upper-left instead of the upper-right, and the OS is application-based, not window-based like Windows.  This last point can be particularly disorienting for someone used to a program closing when its window is closed, for it is not so on Mac OS X.  But none of these problems is really that big of a deal after a little getting used to.

7)  Flexibility and Efficiency of Use.  Here we have two conflicting points.  While the OS itself is flexible (see the customization examples above), the platform is not.  Most people, myself included, don’t see this as a problem since we’re not programmers, but to the development community Apple’s tightly controlled, closed-source platform is stifling (Windows is also closed-source, as an aside).  No one but Apple can change it, so in that sense it is very inflexible.  I suppose that’s why Linux is around.  But Mac OS X is very efficient, especially 10.6 Snow Leopard, which has a tiny footprint on the hard drive and is easy on resources.  Using the OS is efficient too, since there is no bloatware installed and there is no slow-down (with the obvious exception of running highly intensive programs).  Searching is a breeze with Spotlight.

8)  Aesthetic and Minimalist Design.  This is Apple’s mantra: avoid excess.  Everything in the OS is highly polished, clean, and aesthetically pleasing.  Soft gradients are used extensively in windows and buttons, and, unless the user FUBARs the desktop or Dock, everything there is neat and tidy too.  It’s so well done that the latest Windows iterations have been imitating it in terms of shiny design and smoothly flowing animations.

9)  Help Users Recognize, Diagnose, and Recover From Errors.  In the event a problem occurs, Mac OS X is very helpful in providing the user with information.  If a program crashes, a dialog box pops up and presents the user’s options (send an error report, restart the program, etc.).  If your Mac is frozen, the lovely little spinning beach ball definitely tells you so.  And if there are connectivity problems or something of the sort, the OS will guide the user through what to do, such as when the computer can’t connect to the internet and Network Diagnostics automatically kicks in.

10)  Help and Documentation.  Every Mac comes with both documentation and the OS X restore discs, so if something REALLY messed up happens, all is never lost.  There is also the Help menu in the menu bar at all times, no matter the program.  And if you really can’t figure something out, Apple’s website is chock-full of help documents and forums for solving the problem, not to mention the Apple Stores and their Geniuses.  OS X is well supported.

So, after evaluating Mac OS X against the ten heuristics, I’d have to say it scored pretty highly.  Aesthetically pleasing, easy and efficient to use, powerful and stable, and well supported – once one gets used to Mac OS X, it is hard to think any other system is better.  It’s not perfect, but I’d say the reason most people don’t like it is simply unfamiliarity, or that a general dislike for Apple clouds their judgment.  Can’t really fault the OS for that, though.

Mac OS X Snow Leopard screenshot

My topic was Al Gore, and why he says he invented the internet.  So, why does he say that?  Because it’s true – or, at least, mostly true.  While he wasn’t the programmer who actually wrote the first code or anything like that, he was the key political figure behind the legislation that helped the internet come to be.  That legislation was the High Performance Computing and Communication Act of 1991, also known as the “Gore Bill,” because it was Mr. Gore who created and introduced it.  When the bill passed on December 9, 1991, it led to the creation of the National Information Infrastructure, which Gore called the “Information Superhighway.”  Among other things, it also funded the development of Mosaic, one of the first web browsers.  Gore decided to create the bill in the first place after hearing a 1988 report from Leonard Kleinrock of UCLA, a key creator of ARPANET, a predecessor of the internet.  Thus, Gore heard of the concept, realized its potential, and passed legislation that led to widespread adoption of the technology.

Sources:

http://en.wikipedia.org/wiki/Al_gore (I know, Wikipedia, but I also used it to find the sources below)

http://www.computerhistory.org/internet_history/internet_history_90s.shtml

http://www.nal.usda.gov/pgdic/Probe/v1n1_2/info.html

New Media Blawgings

October 23, 2009

The podcast about new media and the reading on “The Medium is the Message” both made good points.  Probably the most interesting moments of the podcast were when they talked about how new media is affecting our culture, which got me thinking.  At first I was of the opinion that new media, while innovative, annoys me because people become obsessed with it, detaching themselves from nature, themselves, and each other.  I was thinking of Twitter and Facebook in particular.  People, especially them youngin’s, text each other dozens of times a day.  They don’t even talk anymore.  What does this do to our social nature?

But then, as I heard these wizened professors discuss how they like and use these services, I had to stop and listen.  One of them talked about how we’re adaptable, and that new media is just another element to absorb into our psyche, but it ultimately won’t change us.  We didn’t exactly devolve into anti-social weirdos with the advent of the telephone, which some could have viewed as impersonal when it became popular.  “People don’t even talk face to face anymore,” they probably said.  But it’s integrated into our society now, and we are pretty much the same.  So it will be with these new media services, they say in the podcast.  I can’t help but agree.  I still don’t like it, but it’s not as bad as I thought.  I got similar ideas out of the reading – that we’re obsessed with conveying messages, so despite these new ways of doing so, we’re still going to do the same thing humans have always done, and we won’t necessarily be any worse for it.  Just a new way of doing things.  That’s progress, I guess.

So, how do these things affect the gaming industry?  Good question.  Perhaps people can follow a gaming company’s Twitter feed and get the latest updates on a game’s secrets, building hype for the game.  Perhaps people can use Facebook or Twitter to network about a new game and spread word of mouth.  From this perspective, as someone who wants to make games for a living, I see this as a very good thing.  Still doesn’t mean I have to like Twitter. :p

As for how the field of video game development relates to traditional media, I’d say television has had the biggest effect, along with print media (i.e. gaming trade magazines like EGM).  Television, being the primary vehicle for advertisement and the receiving of information, reaches just about every gamer there is.  We see ads for the latest games and game systems.  We see gaming channels like G4 report on gaming events.  We even sometimes see the regular news mention a new gaming fad or event.  Thus, besides the internet (which is new media), TV makes the biggest impact on how gamers and game developers alike receive their information about what’s going on in the world of gaming.  Print media does as well, but gaming magazines, along with their newspaper brethren, are rapidly giving way to the internet as a means of getting information.

New media, on the other hand, is essentially tied into the bloodline of the gaming industry, both for gamers and game makers.  In particular, above all else, I mean the internet.  It is the single most common, quick, and efficient way to get information about gaming, whether from gaming sites like IGN or from online news articles.  It’s instantaneous and updated for everyone to see at once.  And for game developers, even aspiring ones like me, it is an indispensable tool for keeping up with the latest development techniques and ideas.  Say, for example, you want to do something called “ambient occlusion” in Maya, a 3D program, but forgot what settings to use.  Google an ambient occlusion tutorial or post to a forum of fellow game developers.  Or say you are looking for a new look for your characters in a game, so you go to ZBrush Central and browse the amazing pieces on display for inspiration.  And let’s not forget personal portfolio sites, a link you can send to a potential employer to check out your work.  Without the internet, the highly interconnected gaming field would be very isolated, and getting ideas would be much more cumbersome.  It was that way before the internet came around, but now everything is much more streamlined.

Columbia Pictures’ new movie, 2012, is connected to many forms of mass communication, both ancient and new.  The most obvious of these is the movie itself, a disaster movie wrapped in a popular myth, yet others are directly related as well, such as the Mayan calendar itself, television, radio, newspapers, internet blogs, and print media.  Many of these media outlets either promote the movie directly or promote it indirectly by generating buzz around the controversy.

The focal point of this entire media storm is the Mayan calendar itself – an astronomical wonder, considering the difference in technology between their culture and our modern one (perhaps they weren’t as “primitive” as we are led to believe).  It is divided into units called baktuns, the current of which happens to end on December 21, 2012.  This end was a significant event for the ancient Maya, though they never once said it would be a disaster of any sort; they did, however, say it would mark a turning point for humanity.  This is the crux of modern discussions.  Modern scientists have begun to study the actual astronomical events that coincide with this change, the most significant of which is the alignment of the sun with the galactic center.  So the Mayans knew their stuff.

And now, in our age of mass communication, the whole world can partake in this frenzy and put in its two cents.  Supposedly, the world might end on that day.  People don’t know what to believe, or they panic, or they become skeptical, or they make movies to capitalize on others’ fears.  Television and the internet are huge in this regard.  With the movie generating such buzz, television talk shows are suddenly discussing whether the world will end, and people are posting videos to YouTube telling everyone they think it’s all a big hoax.  Morning radio DJs are talking about it, too.  It’s everywhere, and it’ll only snowball as the dreaded date gets closer.  Posters spring up advertising an aircraft carrier smashing into the White House.  A random guy might pop up on a street corner downtown and preach to the world that they’re all going to die and that the Lord saved him.  People’s obsession with this sort of subject will perpetuate the situation until, at last, December 21, 2012, comes and goes.  And that’s what I think it’ll do.

While I give much credit to the ancient Maya for their fantastic astronomical achievements, and to their prophecy that something will change, I think that by and large the whole thing is a big distraction, a diversion to keep people from discovering a higher level of consciousness.  This movie is but a pawn in that sequence.  There is enough information out there that the general population knows something’s up, but how they perceive it and feel about it affects what happens.  Control people’s perceptions and you control them.  So, a certain group of unnamed people who know what’s going on will want to stir the general pot.  They engineer the release of a movie like this.  They whip up hysteria.  Oh my God, what’s going to happen?  One section of the population will completely buy into the belief that humanity will change for the better.  Another section will become even more religiously fanatic than they already are and proclaim that the world is going to end.  Most people out there will say, “That’s a load of B.S.”  And when the date comes and goes with no real effect, just as Y2K did, everyone will move on and shut their minds off to the whole thing.  “Oh, it was a hoax after all.”  “See?  I told you.  Now hand me the remote.”  And they will have been effectively controlled all the more.  Get people to believe that actual significant energetic events like this are all a crock, and say goodbye to any higher form of thought.

And then the Mayan 2012 phenomenon will go down in history as just another one of those hokey beliefs that is clearly “not” scientific.  The Mayan people, who are already annoyed at the misconceptions this media buzz is creating, will have to shrug and move on, left to believe that their own calendar is just another oddly accurate scientific tool with no other meaning.  This, to me, is sad but inevitable.  And this so-called “cinematic event” will only fuel people’s misguided views.  Shame.  The action might be good, though.  Let’s all go see it, and be enlightened.

Sources:  http://www.usatoday.com/tech/science/2007-03-27-maya-2012_n.htm

http://www.history.com/content/armageddon

http://www.archaeoastronomy.com/

http://www.sonypictures.com/movies/2012/

Sharing a Design of Mine

October 16, 2009

This is a picture of a 3D torii gate I made, part of a level for an Intro to Level Design class.  It is a blended shot – the final look on the left, fading into the black-and-white geometry shot on the right.

Brian Rejack’s article about video games’ relationship with historical accuracy was an informative read, discussing some problems I had thought about before as well.  For the article, Mr. Rejack used the game Brothers in Arms: Road to Hill 30 as a case study.  His basic premise was that, while video games may use convincing narrative and visually accurate virtual worlds to simulate the reenactment of historical events, they ultimately fail when compared to real reenactments because of the detachment involved in playing them.  This is something with which I can both identify and disagree.

The article begins with a look at an advertisement for Dogfights, a then-new 2006 History Channel WWII documentary series.  Rejack examines how the documentary’s makers use virtual graphics to emphasize its excitement and accessibility to viewers, which leads him to Brothers in Arms, a 2005 video game set in World War II during and after the D-Day invasion.  He goes on to discuss the game’s narrative, told from a soldier’s point of view, and its virtual worlds, all modeled very accurately; however, he then asserts that these visual and storytelling techniques can only carry historical authenticity so far, because “The player’s sole opportunity for interacting with the other characters occurs during a firefight, something that keeps the emotional register permanently ratcheted to fear” (Rejack 2007).  This is very true.  To experience the emotional engagement and understanding a reenactment brings, the participant must become involved.  During a typical reenactment the actors must coordinate with each other while experiencing a physical connection with the place, things a video game can never convey.  The playing experience is usually solitary and focused on gameplay, with that connection experienced only in brief pre-designed cutscenes.  Is the game fun, or is it accurate?  This self-contradictory pull is inherent in the design of historical games, and it takes away from a game’s credibility as historically authentic.  Even with unlockable extras that shed more light on the historical side of the game, players are still limited in their historical understanding because they are focused on shooting enemies instead of connecting with the event.

After discussing these limitations, Rejack points to another game that presents interaction in a different way: Façade.  In it, the player moves through the game by choosing different interactions with two virtual characters.  He says that if Brothers in Arms could present the same visual detail but mix in Façade’s method of story progression, it could engage the player much more effectively.  This way, the game would bring the player to a more personal level by having them talk and interact with virtual people at the virtual historical scene, and the responses they receive could help them understand the event more powerfully.  This is a point with which I totally agree, as someone who wants to go into the gaming field.  I have looked at other games, such as Mass Effect (2007), and found that this level of control over dialogue, and therefore events, does indeed heighten the player’s experience of and connection to the virtual world and characters.  Narrative presentation is constantly evolving in the video game field, and Rejack makes a good point about how static, cutscene-based storytelling can detract from the overall experience.

And yet, while I agree with the article’s points about the separation between historical understanding and the game’s structure, I disagree that the game’s use as a tool for understanding is as limited as he says.  In my view, it is still far better for a person to see and hear a virtual representation of an event than to simply read or hear about it.  If they can be engaged as fully as one is when playing a game, all the better.  More character and situational development simply needs to be mixed into the action for it to succeed.

Source:

Rejack, B. (2007). Toward a virtual reenactment of history: Video games and the recreation of the past. Rethinking History, 11(3), 411–425. doi:10.1080/13642520701353652

I believe the article is credible for the following reasons:  Rejack has an extensive bibliography showing where he drew his ideas from, both third-party and firsthand; his article was published in a peer-reviewed journal; the information is presented objectively and free of bias; and the article is still current, having been published in 2007.

A Tail of Two Dogs

October 5, 2009

Han padded into the downstairs family room, searching for his sister, Mitsu.  As usual, she lay in the center of the room, flat on her back, as their human, Duncan, typed away on his computer in front of her.  Han’s miniature dachshund body did not have far to go as he plopped down next to his sister and yawned.

“Hey.  ‘Sup?”

“Not much.  I licked Duncan’s foot earlier until he started laughing and shooed me away, but otherwise nothing.”

“Oh.”  Han looked around.  Duncan was sitting in his big black chair and moving his fingers over some clickity-thing.  The little dog tilted his head.  “Hmm.  What do you think he’s doing up there?”

Mitsu looked at their human without turning her neck.  Her eyes comically showed a lot of white as they strained to see.  “Dunno,” she replied.  “Probably has something to do with communicating with other humans.”

Han was puzzled.  “Why do you say that?”

Mitsu looked back at her brother.  “That’s what humans are always doing.  Mass communicating.”

Han again was at a loss for words.  “What’s that?” he asked.

Mitsu rolled over to face him.  “Mass communication.  Humans are weird.  It’s a process of communicating ideas or messages to a group of other humans, using visual techniques like waving those pretty flags around or broadcasting through those glowing boxes called Tee-Vees.  Or it could be through aural techniques, like horn blowing or one of the humans getting up and speaking to a bunch of other ones.”

Han laid his head down, pondering such strange words as “broadcast” and “aural.”  “I think I get it,” he mumbled.  “But how do they get those messages out?  I could never bark as far as they seem to go.”

Mitsu licked her nose before continuing.  “It’s called mass media.  They’re channels through which messages are sent to groups of humans.  It could be music or a Tee-Vee news human talking on one of those networks.  You know that one song that Duncan loves so much?  I think it’s his favorite one.”

“The Cinema Show.  Yeah, by that band Genesis.  He loves that one.  And, and,”  Han replied, working himself up with this flash of insight, “that thing on the clickity-box!  The Onion News Network!”

“Mmm-hmm.  He always laughs at those guys.”

The two dogs lay still for a moment, the intense brain activity having exhausted them for the time being.  Han broke the silence.  “How did you know all that stuff?  That’s human stuff.”

Mitsu rolled onto her stomach and sniffed the carpet, then proceeded to lick it.  “I dunno,” she said between licks.  “I just pick it up, since I always follow Duncan around.”

“Hey!  Mitsu, stop licking the carpet,” came a booming voice from above.  The miniature dachshunds snapped to attention.  Duncan had paused his clicking and was looking down at the little dog.  She reluctantly stopped and flopped onto her side.  As he turned back to what he was doing, Han commented,

“Man, I love humans but you’re right, they’re weird.  They can communicate with each other over such long distances through all those different channels, but they can’t understand why we have to do what we do.”

“Yeah.  Oh well,” his sister replied, closing her eyes for a nap.  Han sighed and got up, going off to sniff for crumbs in the kitchen.