The Daredevil

Note on the MCU

If you’ve spent more than a few minutes in conversation with me, you’ve likely heard something along the lines of: “There has never been a better time to be a comic book fan.” Nowadays, I could not be more certain of that; the ageless rivalry between DC and Marvel is taking Hollywood by storm, every single hero – at least the ones that matter – is being picked up by networks and studios left, right, and centre, and meanwhile I’m sitting here with a barrel of popcorn watching it all unfold. Although the approach taken by each of the two comic book giants is entirely its own – making a direct comparison slightly irrelevant – Marvel seems to have the whole interconnected-storylines thing figured out. I do not intend to delve far into the Marvel vs. DC discourse (that is a topic for another time); both are companies with their own distinct ways of operating, whether in print or on the silver screen. I am, however, definitely going to bring up the Marvel Cinematic Universe. This isn’t the first time I’ve mentioned the MCU, and – the way things are looking – it probably won’t be the last. The first thing I will tell you about the MCU is that it isn’t an easy venture. The plan was to collect superheroes from all over the Marvel mythos (only those Disney owns the rights to, though), figure out a way to connect them through an overarching story, stay true to the comic book versions of the heroes, and preserve production quality across different mediums. Daredevil pretty much checks out on all of the above; although there wasn’t an overwhelming number of tie-ins to the Marvel Cinematic Universe, there were definitely enough to give you a sense of the implied interconnectivity.

The Review

Daredevil is one of the first installments in a series of Marvel TV productions to be showcased on the California-based streaming service Netflix. Hearing tidings of a Daredevil TV show was not the most pleasant of experiences; I was assailed by flashbacks of the monumentally bad Ben Affleck-led Hollywood adaptation, forcing me to break a sweat in an attempt to keep my lunch down. Thankfully, the nausea abated almost instantaneously when I learned that among the showrunners was Drew Goddard, who has worked with Avengers: Age of Ultron writer/director Joss Whedon on a multitude of productions. I was confident the show was in good hands.


A.K.A. Jessica Jones, Luke Cage, and Iron Fist are all titles scheduled to appear on the service throughout this year and the next, with an undoubtedly magnificent crossover miniseries called The Defenders to act as the capstone for the Netflix-exclusive line-up.

Beware: spoilers ahead

The series opens with a rather typical exposition of Daredevil’s origin story, much like its nausea-inducing predecessor did. This parallel was not entirely unwelcome, as the opening scene of the movie wasn’t horrendous. Thankfully, the show settles into a darker tone shortly after the opening sequence, when the rest of the supporting cast is systematically introduced to the viewers, starting with a corpse, and then another one. In case you didn’t catch that, that’s a reference to the astounding emotional range of Deborah Ann Woll as the enamored secretary Karen Page, although to be perfectly honest, I am not sure whether her natural acting ability – or lack thereof – is to blame, or rather the poor dialogue she is given. (To be fair, she does pull off the “startled deer” face better than anyone I’ve ever seen.)

Karen Page isn’t the only weak link in the chain; one of the biggest problems I have with the show is how poorly written the female roles are. With the exception of the Rosario Dawson cameo, every single female character felt lackluster in comparison to her male counterparts. Even the gorgeous Ayelet Zurer’s portrayal of Vanessa – who has a pivotal role in the story and is a major source of character development – left me wanting more. It may not entirely be Goddard’s or Zurer’s fault; it may simply be that I could not notice her character over how awesome Vincent D’Onofrio’s Wilson Fisk was.

It’s been a while since I was legitimately in awe of a character on any screen; the last time I felt this enthralled was watching Gustavo Fring (Giancarlo Esposito) on Breaking Bad. I could write a sonnet and a half illustrating just how fantastic I thought Wilson Fisk was. Throughout the entirety of the show there wasn’t a dull moment involving him, and the fact that I can say that of what is essentially a 13-hour-long movie is a testament to how well the character is written and depicted.

(I know I said that Marvel vs. DC wouldn’t be brought up, but I guess I lied.) There were some plot directions taken by the showrunners that I definitely disagree with; most of them involve killing off characters that are somewhat important within the MCU as a whole – Ben Urich and Leland Owlsley come to mind. They both enjoy relatively important roles in the comic book Marvel Universe, and though their deaths serve to develop my favourite character on the show, it is unfortunate that they have been forever removed from the MCU. The other decision I disagree with was taking away from the comic book Matt Murdock by completely stifling his playboy personality; I understand the show was going for dark and gritty, but is it too much to ask for a little bit of that suave personality we know so well?

The Marvel Cinematic Universe has been lacking a captivating villain to stimulate the other end of the morality spectrum – something DC has almost always had, whether it be Heath Ledger’s chaotic Joker, Jim Carrey’s psychotic Riddler, or Jesse Eisenberg’s upcoming Lex Luthor. In comparison, Ultron was incredibly bland, Red Skull is nowhere to be seen, most of HYDRA is gone, Von Doom and Magneto belong to Fox, and Loki is just an insane demi-god who eventually bores you with his childlike antics. Times have changed, though; Wilson Fisk has given the MCU something it has needed for the longest time: a villain that just baffles you. It may be a little too hasty to call him the Joker of the MCU, but that claim definitely isn’t too far off. I hope.

Characters aside, the plot is a rather benign superhero-vs.-supervillain story with a few memorable scenes dotted throughout. That’s not to discredit just how memorable they were – take the fight scenes, for example. Daredevil’s fight scenes are some of the most entertaining I have seen in a long time, better than those in any other superhero/comic-book-inspired show currently on TV. In particular, the one-take hallway scene at the end of the second episode elicited a lot of respect for the show; I was basically hooked after that. Add some haunting Tarantino-esque violence involving Wilson Fisk and you have yourself some excellent television. I would definitely recommend you take a day off work and binge all 13 hours, as it was meant to be watched.

The role of marketing

Preface v2

This was my submission to an International Marketing midterm assignment. I was convinced by my writer’s alter ego to cross-post it onto WordPress and dust off the metaphorical bookshelf. I took a rather unconventional approach to writing this assignment, deciding to do away with academic prose and run rampant with ignominious, ignoble, and incongruous allegories.

This was how I implored my professor to display his merciful side:

I chose to interpret the assignment as befitting a more liberal prose style: similar in severity to a formal blog post about Spanish Civil War reenactment, but not as serious as a review of New York’s finest patisseries in Travel and Leisure. I hope that my sardonic (read: unfunny) literary style does not infuriate you to the point of chucking your laptop out the window, or worse yet, deciding to fail this paper. But in the event that you do, I understand, and I hope you’re insured.

 

  1. Marketing: What it’s not

There is a rather ubiquitous misconception that the central role of marketing – or rather its raison d’être – is confined to providing jobs for business degree-holders who are terrible at mathematics. While the sentiment is rather contemptuous, anyone who has ever studied derivative pricing in pursuit of an MBA will likely find themselves agreeing with it. Whether that statement reflects the scornful temperament of finance majors or bears a ring of truth about marketing remains to be seen. Marketing is also often, and erroneously, confused with advertising, whereby the marketing department is seen as nothing more than glorified, numerically-illiterate salespeople who are paid much more than they should be.

What if I told you that one of a marketer’s greatest responsibilities is to analyze hundreds upon hundreds of historical and current data sets, consisting of thousands upon thousands of observations with millions upon millions of variables? All (in an attempt) to find out whether customers prefer to wear a V-neck, polo, or crew top when going shopping. Sounds quite mathematical, right? Also rather pointless, maybe?
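In the spirit of full disclosure, the shirt-counting exercise above really is this unglamorous at its core. A minimal sketch in Python – with entirely invented survey data, of course – of what tallying those customer preferences might look like:

```python
from collections import Counter

# Hypothetical survey responses: which top each shopper wore to the store.
responses = ["v-neck", "polo", "crew", "polo", "crew", "crew", "v-neck", "crew"]

counts = Counter(responses)
total = len(responses)

# Share of shoppers preferring each style, most popular first.
for style, n in counts.most_common():
    print(f"{style}: {n}/{total} ({n / total:.0%})")
```

Real market research just does this over far more rows, far more columns, and far more wine.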

“Marketing wants to be the key that unlocks the doors to customers’ wallets.”

  • Khaled Choucri (2016)

Admittedly, I am unlikely to be famous enough for a proper quote; how about this one instead:

“If you can’t explain it to a six-year-old, you don’t know it yourself.”

  • attributed to Albert Einstein (1940s)

Poignant statement, Herr Einstein. Let’s hope I know what I’m doing and that no one under six is reading this.

  2. What is a Brand anyway?

Nowadays, businesses are scrambling about trying to capitalize on the rapid advancement of technology, particularly with regards to communication innovation. We’re talking about the big SM: Social Media. Your mom’s on it, your dad’s on it, your dog’s possibly on it as well, and that’s just it: everyone is connected, and that includes businesses that are trying to sell stuff to you.

How does a business use social media, though? You don’t exactly see Bob from Accounting posting a funny cat picture on his wall while simultaneously educating his friends on the horrendously low premiums of his company’s life insurance products. Not to discredit the hilarity of funny cat pictures, but no, you don’t. Instead, businesses use the concept of a ‘brand’ to interact with their customers on platforms such as social media.

Bob from Accounting (Taken from shutterstock under free use license)

A brand can be thought of as a persona that exemplifies the values, goals, and objectives of a company. From the customers’ point of view, a brand can also be conceived of as the collective perception customers have of the ‘personality’ of the firm and/or the products associated with it. For example, typical Apple users view Microsoft products as pointlessly complicated and technologically inferior, while Microsoft users often confuse Apple products with educational toys for toddlers.

Now why should a company care about its brand? In order to answer that, I’m going to ask you a simple question: Are you more likely to give money to someone you like or someone you don’t know? Concise enough, I hope. Moving swiftly on.

The problem with brand management is that businesses sometimes come off as ‘trying too hard’ to impress their customer base. This can be rather off-putting, kind of like the guy you know who always tells the same three edgy jokes about something wholly inappropriate for the situation; they might be funny the first few times, but they get old quite quickly. Ideally, companies should manage their brand to nurture the relationship they have with their customer base.

  3. Market research: the sorrows of a business student who disliked mathematics

You thought you could study business without ever having to learn math, didn’t you? Or at least you hoped you’d never encounter it again after that one statistics course that everyone failed and repeated the following summer.

I truly am sorry for your loss.

Market research boils down to a series of questions that we ask the market with a certain objective in mind. Questions like: “When do customers buy our product?”, “Why do customers buy our product?”, “What do customers want our product to be?”, “Are customers happy with our product?”, and so on. (That last one is particularly tricky.) The objective is to use this data to help the company make an informed decision about the marketing of its products.

The fact that the answers to those questions usually take several days of intensely staring at a 300-page spreadsheet over a bottle of wine or two, classifying and sorting through hundreds of questionnaires, surveys, and online polls, wallowing with existential dread over the fact that you cannot possibly ask for another extension on the deadline, all the while debating how much you really need this job, should in no way discourage you from pursuing a career in market research.

It is unsurprising that most firms choose to outsource their market research to dedicated firms like Nielsen and Reuters. Then again, how masochistic do you have to be to work at one of those companies?

In a nutshell, one can say that the goal of market research is to aid a firm in its decision-making processes involving the market while also empowering the firm’s customers to be able to influence these decisions in a way that better addresses their needs and wants.

  4. Product design, development, and pricing

The archetypal process that governs a product’s existence in the market is referred to as the product life cycle. Marketers are involved with this process from its inception to its dying breath and are largely responsible for many of the products we use in our daily lives. The product design phase of this life cycle incorporates the analysis from previously conducted market research, depending on the history of the product in question. For an already established product, the company surveys its customer base for recommendations on, or gaps within, the current product’s performance. For a new product, the company tries to figure out whether there is an existing void in the market to fill and, somehow, to gauge the product’s success before the fact, to help decide whether the venture is worth it.

Marketers have many tools at their disposal to aid the company in its product development phase: focus group studies, customer satisfaction surveys, and industry-wide reviewing platforms (such as CNET for software products, Tom’s Hardware for electronics, etc.). Marketers also take advantage of proven patterns in consumer behavior to better design and develop products. For example: people tend to buy more ice cream in the summer, because the weather is hot and ice cream helps them cool off. Therefore, a cold product will usually sell better in the summer.

Arguably the most important decision that marketers contribute to is the product’s price point. For nearly all goods and services in the market, price affects our decision to buy. There are only a few products for which consumers are price indifferent, and this price indifference can be shown to be a function of personal income or wealth. That’s why billionaires often dine at the most expensive restaurants, where the marketers can get away with charging $10 for a glass of ice water.

  5. Promotion: The glorified salesman

Last but certainly least, a marketer’s role is also to advertise a product. After all, who’s going to buy your product if no one knows about it? Hundreds of hours of creative brainstorming sessions undoubtedly went into the powerful slogan “Got Milk?”, which merited a national penetration percentage of greater than 90%; in plain English, that means more than 90% of people in the United States have seen the slogan in one form or another. That is massively successful for an advertising campaign.

The goal of an advertising campaign is to raise awareness of, and disseminate information about, a company’s product. Through this campaign, the business endeavors to generate demand for its product by publicizing the benefits and value of its brand. During an ad campaign, businesses also often take the time to differentiate themselves from competitors or, more often, to straight-up denigrate other brands.

Promotion, in a perfect world, should make customers feel like they are getting far more value than what they are paying for. While the reality is frequently far from that, the customers (usually) don’t know any better. Conversely, terrible advertising often spells the end for a brand or sometimes even the entire company. Look no further than BlackBerry – formerly Research In Motion – for confirmation.

  6. Ad Finitum

I hope that my paper has firmly established the role of marketing within the organization. To advocate for equal opportunity, I will catalogue a series of quotes from various people (mostly myself) whom I asked about the role of marketing in an organization:

The interface (or lubricant) to connect seller and buyer, producer and consumer, business and customer.

A quantitative and qualitative method to evaluate and affect the performance of a product in the market or the estimated/expected performance of a future product.

An effort to examine the behavior of markets and identify any patterns that can be exploited to reap greater benefits.

Just like mathematics is an attempt to understand the natural world, Marketing is an attempt to understand the market.

Maintains and cultivates the image of the firm, also known as brand, within the market.

In the hands of the marketer, it helps identify needs and how best to satisfy them. In the hands of the customer, it helps them choose what gives the best value for their money.

  • Michael Samy (2016)

The more you are aware of customer wants and needs, the better the goods and services you can provide, and the better the firm will perform. Marketing makes you aware of those needs.

  • Mai Hindawi (2016)

It nurtures the customer’s perception of value for the goods/services that the firm provides.

  • Adham Shebl (2016)

Emotional Marketing

I first encountered the KONY2012 phenomenon with the #InvisibleChildren hashtag on BBM; almost all my friends had it proudly displayed as their status, no doubt a powerful reflection of teenage hyper-activism. It was one of the largest social media campaigns of its time, and this is coming from someone who experienced the full brunt of the Egyptian revolution firsthand. People everywhere were whispering (and sometimes yelling) the name: “Kony”. I was having none of it, at first, but I eventually succumbed to the overwhelming torrent and decided to watch the YouTube video explaining just what the big deal was.

After watching the video, which was clearly engineered for this specific purpose, I felt quite a stream of emotions. It didn’t really click with me why I felt so enthralled and invested in the cause simply from watching a 30-minute pseudo-documentary, and for whatever reason I decided to ignore it – until I was assigned to read “The Science of Emotion in Marketing” by Courtney Seiter.

It was really quite surprising just how significant a role emotions play in marketing campaigns. The article goes on to demystify the mechanisms behind viral occurrences on the internet, most of which are connected in one way or another to the emotional spectrum we experience on a day-to-day basis.

The reason behind any video going viral is its ability to invoke a compulsion to share it amongst your social circle; this compulsion – according to the article – is based on one of the most innocent of emotions: happiness. It is easy to see how the uplifting start of the video puts the viewer in a positive mood, with the focus on the inception of the Invisible Children organization and its noble goal.

The next emotion targeted is one that helps create a sense of empathy and connection: sadness. It is relatively commonplace for a tragic experience to forge a bond between those connected by the shared sadness. The director really hits a home run here, with a hauntingly captivating sequence of photos of the havoc and destruction wrought upon the Ugandan people by the Lord’s Resistance Army (LRA), culminating in a teary plea by a young boy to end the suffering.

Fear also plays a role in propelling the video to viral proportions. The reason being that fear incites feelings of desperation and anxiety that often cause viewers to develop an attachment to the cause.

What better way to instill fear into viewers’ hearts than by showcasing the harsh facts outlining the scale of tyranny imposed by the LRA?

Being one of the cardinal sins should give you an indication of just how powerful anger can be as a response. Nothing quite causes you to dig in your heels and stand your ground more than unannounced aggression or infuriating news. Both elements are, unsurprisingly, utilized within the KONY2012 campaign video; spurred by the aforementioned fear, viewers feel the desire to overcome the despair and fight against the oppressors (all the while yelling “vive la révolution!” from their windowsills), which somehow translates into one of the most basic animal responses: anger.

In conclusion, emotions play a paramount role in the marketing of brands and content to consumers, and if your product contains just the right amount of emotional orchestration, it is more likely to impact a significantly larger audience.

Community

I’ve mentioned before that the internet never ceases to amaze me, and it amazes me because of what humans can do with it. The existence of a digital parallel to our world, in which every individual with access to it is connected, allows the impossible to happen – and sometimes the impossible is the necessary. I’d like to argue that the internet makes the world a better place. That may not be an entirely unbiased statement.

Rather than follow suit with my previous posts, I’ll begin with a personal account of a story quite relevant to the subject at hand. It was about four years ago (I wasn’t even in college then) that I came across my first Kickstarter project: a campaign to raise money for a YouTube web series called Video Game High School.

At first I was hesitant to put any amount of money into something that might not even happen, despite claims that the money would be refunded should the project fail to meet its goal (16-year-old me was understandably doubtful of online banking). However, I caved after a brief look through the incentives, which are really just different emotional/financial investment options designed to simulate some form of personal attachment to the product you are funding – such as having your name placed on the web series’ Executive Producers list.

It took a few hours, and my sizable donation of $5, for the project to reach its goal of $75,000. From then onwards the donations piled on, eventually reaching $273,725 from 5,661 donors in 30 days.

Statistics aside, the lesson to take away from this is that crowdfunding works, and it works really well. It is a natural human instinct to want to leave a mark; nothing quite lives up to the sheer motivation spurred by the desire to impact the environment and the lives of those around you. The collective sense of investment invokes feelings of commitment and responsibility towards the project, which causes donors to advocate for the cause and promote it amongst their circles. It’s easy to see the benefits of crowdfunding.

Sometimes, though, the benefits aren’t so easy to see. In this dense article, a columnist for Truth-out.org pens some rather scathing remarks regarding the validity and honesty of a modern-age charity by the name of Charity: Water. The charity itself is innovative in its delivery and branding, and places emphasis on targeting the youth. Despite lacking a marketing program, it still manages to spread like wildfire; that is because it utilizes crowdsourcing to fill the gaps left by its conservative budget.


High-profile donors often make promotional videos to showcase the endorsed foundation/charity on social media, sometimes even without the organization orchestrating them; a notable example being the viral “Ice Bucket Challenge” videos.

Though many obstacles faced by charities in the past have been somewhat tackled by crowdfunding initiatives, less-than-truthful organizations and people still exist, and they too can make use of sites such as Kiva to raise money for their questionable causes. Thus, a discerning eye remains a potential donor’s most valuable asset in deciding whether or not to empty out one’s coin purse.

The Art of Critique

An age-old saying goes: “Be your own worst critic.” But in this day and age, when the internet allows virtually anyone to comment on your profile just to say how unflattering they think you look in that outfit, that advice can seem a little redundant. Whether you are a university student or a big-time Hollywood exec, the opinions of your audience directly impact your chances at a successful and meaningful life.

Since so many livelihoods are affected by professionally (and mostly unprofessionally) written opinion pieces, there is, without a doubt, a great deal of import attached to the rhetoric used within a critique. As we’ve seen before, a mere 140 characters posted on the internet can be enough to ruin your life, so some rules exist for the benefit of all. Here, I will hazard an attempt at outlining some habits to incorporate into your life if you plan on utilizing the powerful medium of the review. I will also use this opportunity to shamelessly plug my own recently published review. Moving swiftly on. I hate lists.

Now let me completely ignore my previous statement and start out with something somewhat related: 3 things an initiate to the review community should avoid like the plague.

  1. Trolling. I pray that the Lord have mercy upon the unfortunate souls who encounter the not-so-fabled internet troll. This is most definitely the number one thing you want to avoid if you want to be taken seriously, unless of course you are a master troll. Several articles outline why exactly trolls make the internet a worse place, but the general idea is that they often serve to distract the audience from the main point through insults, nonsensical information, random babble, and off-topic meanderings.
  2. Taking things too personally. The nature of a review is undoubtedly personal, and since you are revealing your own insight regarding a particular experience, it is quite easy to get carried away. Depending on your writing style and rhetoric, some language can be detrimental to your credibility and the quality of your review, particularly language of an explicit nature. Of course, it all depends on how you choose to frame it.
  3. Libel. This is probably one of the worst things a critical piece can contain. It is defined as a personal attack on someone through written work – much akin to slander, its spoken counterpart. Unfortunately, the brunt of hurtful comments is often directed towards professors and university faculty on student rating websites such as RateMyProfessor.com. Remember, while your history professor may be terrible at explaining the outcomes of German unification, that doesn’t give you carte blanche to discuss his or her poor taste in cardigans. Also, don’t forget that your professors can fight back.

This list is relatively short but almost entirely exhaustive. Some of the best practices of reviewing depend on the exact product or experience you are covering, but a general word of advice is: “Don’t write anything you wouldn’t want your mother to see.”

End Note: If you are in the mood for a laugh, check out this Amazon customer review of Haribo Golden Gummy Bears.

Avengers: Age of Ultron

Introduction to the MCU

Ah, the Avengers; the fruit of several years of labor and countless machinations to adapt the timeless comic book series to the big screen. Though this is only the second installment in the Avengers series, it feels as though it is the end of a proverbial era; an era where superhero movies were not connected by an intricate and sometimes overwhelming mythos, such as the Marvel Cinematic Universe (MCU).

Age of Ultron (AoU) is the penultimate installment in Disney-owned Marvel Studios’ so-called “Phase 2” of the MCU, with Ant-Man having the honor of being the closing act, tentatively set for a mid-July premiere. The extensive timeline of the MCU is designed to immerse viewers in a world of cosmic superheroes and planet-rending villains across multiple film franchises, painstakingly collected through Disney’s herculean efforts over the previous two decades – with the notable exclusion of the X-Men and Fantastic Four franchises, both of which are owned by Fox. The latest franchise to be amalgamated into the MCU is our favorite web-slinging hero of New York (who was conspicuously absent during the large-scale destruction wrought by a very large angry green man – more on this later).

If you’re completely lost, or struggle to remember some of the finer details, then I suggest watching this video; it sums up the storyline of the MCU so far.

SPOILER WARNING: This review contains a multitude of spoilers for Avengers: Age of Ultron as well as many other Marvel Studios movies. If you have not seen the movie, I highly suggest you do so beforehand. Additionally, I do not claim ownership of any of the MCU movies, their characters, or any of the images used below.

The Review

The movie opens immediately into the fray of battle; I personally felt like someone hit the fast-forward button on the projector instead of play – anyhow, moving swiftly on. Right away, the experience the Avengers have gleaned throughout their previous escapades showcases itself in a visually impressive multi-scene fight sequence – which even includes a fantastic comic book-esque Team Avengers shot – throughout which the Avengers’ teamwork is at an all-time high.


Avengers Assemble!

The shining moment of the first act was the introduction of the next additions to the Avengers squad: the Sokovian-born Maximoff twins, a.k.a. Scarlet Witch and Quicksilver (who were momentarily sojourning in villain-town). I believe their casting was spot-on, but Quicksilver could have had more dialogue than the slew of quips and one-liners we were subjected to. Compare that to Days of Future Past’s Quicksilver, played by Evan Peters, who was more powerful, funnier, and better written, despite being less true to the comic books. Whereas Aaron Taylor-Johnson’s Quicksilver felt hollow and difficult to relate to, the situation is entirely different with his twin sister. Wanda Maximoff was a beautifully written character, portrayed to perfection by Elizabeth Olsen – with, I will add, enough dialogue to allow for introspection into her deeply troubled psyche, while preserving more or less the same amount of action as her lightning-fast brother.

Now here comes the bombshell: James Spader’s performance as Ultron was sometimes underwhelming. Yes, I said it. He was unable to incite enough fear within the Avengers (Scarlet Witch was awfully good at that, though), he had an astonishing lack of reference-free dialogue, and despite his apocalyptic intentions it was hard to take him seriously. It didn’t really feel like the “Age of Ultron” I was promised.

Spader-tron disappoints

The subsequent defeat of the Avengers at the hands of Ultron and the Maximoffs caused the team to lose faith in themselves, which conveniently allowed us a brief insight into some backstories that were previously left entirely ambiguous. I enjoyed the psychosomatic flashbacks (particularly Black Widow’s) almost as much as I enjoyed the ensuing Nick Fury cameo, who served the role of “coach whipping the team back into shape in time for a second wind on round two”.

This being a superhero movie, the team (sans Thunder God Thor) naturally rebounds for another shot at the title. No surprises there.

Vision is by far, and beyond any possible means of comparison, my favorite character yet introduced to the Marvel Cinematic Universe, surpassing even Robert Downey Jr.’s Iron Man. Paul Bettany was spotless in the execution of his role; his portrayal of the Mjolnir-wielding synthezoid Avenger was never off point. His introduction sequence was memorable, his dialogue was engaging and essential to the development of the plot, and he’s just really… really… cool…

So cool

From then on, the movie just kept getting better; Vision’s impact was almost instantaneous and resounding, and his part in the final fight scene was pivotal to the story. Speaking of the fight scene, how cool was the “defend the base” sequence near the end of the movie? The one stale moment was the needless death of Quicksilver; I say needless not because his death was forced, but because his presence will be missed in future Avengers flicks. Though, to be honest, it is a Whedon movie; someone had to die. It just couldn’t be Hawkeye – that would be cliché.

End note: The very last line of dialogue was “Whedoned” quite well, though I don’t know about you – I still yelled out “Assemble!”, and I wasn’t alone.

Conclusion

Joss Whedon’s Avengers: Age of Ultron does a good job of telling the tale of the Avengers at their best, and at their worst. Though some of the dialogue was a little off-putting, particularly when it came to the antagonist, the majority of it was solid writing. Moreover, the new characters were beautifully introduced and portrayed – kudos to Whedon. All in all, the movie is a great stepping stone for the Marvel Cinematic Universe, and performs fairly as its own standalone motion picture.

Sensitivity

If you’re an avid forum-frequenter – such as myself – or one who enjoys watching other people’s lives being ruined by the internet, then you’ve likely come across the fabled internet phenomenon known as “the witch-hunt”. It is probably a rather self-explanatory term, but for the sake of clarity, allow me to make the following analogy: an internet witch-hunt is not unlike a real-life witch-hunt, but instead of pitchforks and torches being the angry mob’s weapons of choice, the modern-day witch-hunter prefers tweets and strongly-worded emails.
In the past, those accused of being witches were likely to be put on trial (unfairly), appear in court (briefly), then be held in containment until the pyre was constructed (hastily). You will be very glad to know that humanity has held up this tradition rather zealously; the case study here is momentarily-famous-but-not-really-famous Justine Sacco – a former PR representative of a rather large US media company. Now, if you are somehow unaware of the events leading up to the addition of the “former” in my previous sentence, all you really need to know is that Ms. Sacco made the horrible assumption that she could be racist on the internet and get away with it. Pssh, I know, right?
All it took was one off-colour (pun intended) tweet for Ms. Sacco’s life to become a real-life simulation of the Salem witch trials, updated to the 2014 version of course. She lost her job, was publicly executed (shamed), and lost the respect of her family, friends, coworkers, boss, and 15,000 people on the internet.
This startling example of internet vigilantism truly highlights the power of herd mentality and the love of “justice” (because God help you if you do something stupid on the internet) that humanity oh so dearly enjoys.

“Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!”
– Justine Sacco (RIP)

The tweet that started it all.

Perception

On Perception

An apology in advance if you are quick to fume at viral internet idiosyncrasies. I do not mean for it to become a trend of mine to discuss the goings-on of the internet on a weekly basis; especially when the subject matter at hand is as drab, and yet astounding, as the sheer uproar generated by an out-of-place optical hoodwink that managed to proliferate interest to the point where the use of the word “proliferate” is justified. If you’ve managed to piece together the potential content of my blog post and are still here, then I welcome you, for you are of a far calmer disposition than many of my friends. If you are yet to understand what exactly you’re here to read about (maybe because you have somehow avoided the collateral damage from this internet nuke by locking yourself in a room with no WiFi), then allow me to spell it out for you: I’m talking about the stupid dress.

In my childhood I was quite obsessed when it came to optical illusions; I would print out as many of them as I could find — with the limited Google skills that 9-year-old me possessed — and hang them around my room. Can you imagine how worried my parents must have been? They’ve got to be relieved that I turned out mostly fine. This obsession dwindled into nonexistence as my interest in the opposite sex catapulted skyward, as would be expected. I really had no intention of giving an iota of time to the ‘next big thing’ on the internet; I tend to limit my use of the world wide web to intelligent discussion on various forums and countless hours spent Googling pictures of cats.

It was my belief that to exhibit any inkling of fascination towards optical illusions you had to be either a twelve-year-old or a schizophrenic artist; but clearly, the internet has proved me wrong, as it has on countless other occasions. Illusions have been around far longer than anyone currently on planet Earth, and if someone like Epicharmus of Kos or Protagoras from the 5th century BC were to suddenly appear 2515 years into the future, they would probably tell you that this whole dress thing is old news.

On the Philosophy of Perception

“The mind sees and the mind hears. The rest is blind and deaf.” – Epicharmus of Kos, circa 500 BC

The very core of philosophy is to examine, and to examine you must first perceive. Our Greek ancestors had a field day attempting to describe the mechanisms governing human perception. They managed to raise more questions than they answered, but isn’t that the joy of science and exploration?

One of the questions catapulted to the frontier of philosophical inquiry is “Do we influence reality through perception? Or does the universe place us at the mercy of our senses?” Now, being the naturally inquisitive and adamant person that I am, I chose to ask my friends for their opinions – totally not a mistake. Regretful decisions or not, I learned quite a lot from most of the answers. (Note the presence of “most”.)

Putting aside remarks that cast a shadow of doubt on the mental prowess of the company I choose to keep, one of the most insightful answers came from a close friend who is actually studying philosophy. He told me that most philosophical arguments regarding perception are classified into one theory or another: idealism, realism (indirect, or otherwise), skepticism, or any of a multitude of words ending in “ism”. Thankfully, I am not here to provide an epistemological comparison between each theory of perception – ancient or modern – while listing the philosophers who pioneered them. I am merely here to pretend I know ever so slightly what it is that I’m talking about.

One of the theories – I neglect to recall which – argues that the entirety of the observable universe exists only within the minds of those that can observe it, in one way or another. The data contained within an object – color, shape, texture, weight – is just a way to stimulate the correct areas of the brain, so that we experience the object in a form our brains can comprehend and associate with other preconceived knowledge. This theory plays well with relativism; ergo, everything we observe is described in our heads relative to something else we already know of or have seen.

My personal belief is that the mind is separate from the universe; in essence, the universe does not exist within the mind. However, we are given tools that allow us to observe this universe through our senses, though our understanding of the world is limited to our capacity to process information, which may or may not be infinitely variable. This plays into the theory that each person sees things in a different light, as each person possesses a completely different set of conditions governing the information-processing abilities of their mind.

Philosophy can be very confusing, but it is at the heart of everything we do in our lives; though sometimes it may go a little too far – such as the utter destruction of the concept of the self (Buddha would be proud) carried out by Sam Harris in this video.

On the Psychology of Perception

“The whole is other than the sum of the parts” – Kurt Koffka

One of the intriguing answers I received regarding this presumed status quo of perception came from my own cousin, who preferred to shun the philosophical aspect in favor of a purely psychological approach to the issue at hand; he referred me to a list of observed principles known to psychology as the “Gestalt principles”, pioneered by the aforequoted (not a word, by the way. I really wish it was, though) German psychologist Kurt Koffka.

These principles attempt to explain the intricate way human beings actually see anything; they do so by theorizing several laws (I prefer to call them axioms; it sounds so much cooler) governing the process of observation. Without further ado, they are:

  1. Similarity: Objects that are similar are automatically grouped together by our brains
  2. Prägnanz: Reality is reduced to the simplest forms conceivable
  3. Proximity: Objects that are in proximity to each other are grouped together
  4. Continuity: Lines are seen as following the smoothest path
  5. Closure: Objects grouped together are seen as a whole

The final principle is probably the most influential when it comes to observation; through it we can explain our brains’ ability to “see” negative-space images. Though this has little to do with the actual dress phenomenon, you probably realized that I used it as a segue to talk about something that is actually interesting.

(The word Prägnanz is a German term meaning “good figure.” Knowing this, I will now probably use it to flirt with psychologists.)

I call him Gestalt.

Closing Remark

I chanced upon an advert strewn across my Facebook feed that is part of a campaign against domestic violence and abuse. The ad makes rather clever use of the dress in this context; however, I would like to take a moment of your time to remind the world that violence and abuse are not exclusive to one gender, and if you are in a relationship that is abusive in any way, shape, or form, then I implore you: please do not be ashamed to ask for help.

The Blog: A modern discussion

On the Origins of Art

Ah, self-expression… what other pursuit has driven humanity to the insane, the incredible, the unlikely, and the downright stupid? Throughout every imaginable point in history, mankind has endeavored in one way or another to make their inner feelings and musings known to the outside world. This extremely basic desire to unleash upon the world the harnessed emotional energy that storms just beneath the surface of our meandering husks has brought upon us the genesis of the world’s most wonderful phenomenon: Art. From the early papyrus scriptures of ancient Egypt, to the stark raving mad internet antics of politically-conscious 14-year-olds on tumblr©, writing has always been a powerful medium of exchange for the human race. Another force to be reckoned with is that of modernization; the very reason I’m currently tap-tap-tapping away at my keyboard. These two vectors intersect to produce a multitude of avenues for self-expression; the blog included.

In a rather long and particularly academic paper titled “Blogging as a Social Action: A Genre Analysis of the Weblog”, Carolyn Miller and Dawn Shepherd of North Carolina State University have managed to condense a wealth of revelational (that is a word, by the way) information regarding the origins and the discipline of the internet log; the most groundbreaking, earth-shattering of revelations being that the word “blog” originates from an abbreviation (amalgamation) of “WeB LOG”. How awesome is that?

On the Relevance of Blogs and the Destruction of Privacy

The paper undoubtedly discussed much more than the clever construction of the word “blog”, what with it being a rigorously researched and painstakingly carried-out work of academic and literary art (they used the word “kairos” 15 times in one paper; that’s got to be a world record of some sort). Blogs are relevant – apparently – and thus the discourse regarding them is also relevant. Such a logical deduction serves as a backdrop to the events surrounding the rampant growth in the popularity of the blog in such a short span of time. The supposition highlighted in the paper is that blogging chanced upon a perfectly opportune moment in time to reveal itself to civilization, and thus catapulted itself into relevance in the modern world thanks to this planetary alignment of sorts.

Something that really strikes a chord with me in the paper is the way it describes the haunting desire that humans possess: the desire to invade the privacy of people – strangers or otherwise. The tongue-in-cheek choice of words that resonated with me was ‘mediated voyeurism’. Some not-so-recent, highly publicized scandals, such as the Clinton-Lewinsky affair and the tragic end to Princess Diana’s life, have underscored the very human need to know things we don’t necessarily have the right to know, and the succinct choice of words serves to illustrate the almost-pornographic nature of reading into others’ lives for personal entertainment. It’s also completely reasonable to assume that this addiction has brought to our lives the vile hell spawn known as reality TV – another reason to completely despise human nature.

On the Nature of Blogs

Like all art forms, the blog is stratified into various categories or genres. Also like all art forms, you will eventually run into some gray areas when it comes to defining what exactly it is you are looking at/listening to/experiencing; thankfully, the paper does a good job of making sure you are completely aware of that fact. Someone by the name of Rebecca Blood reasoned that blogs should be categorized by their content and writing style – quite a reasonable assumption if you ask me. She mentions that the blog is loosely split into two major varieties: the variety that seeks to provide a legitimate source of information to the reader is dubbed the “filter-type” blog – you may consider them akin to newspapers or books – while the alternative variety, inspired by the style of your sister’s secret diary – who just really wants to find a way to talk about her feelings without being judged – is known as the “personal weblog”. Further subdivisions of genre have been proposed by Jill Walker, whereby the classification of the blog can be drawn from the choice of medium and the presentation format. The audiolog, videolog, and photoblog (Instagram does not qualify) are typical examples of the variation in choice of expression.

Despite the lack of a universally accepted definition of a blog, there seems to be a consensus regarding the principal elements that any and all blogs should contain: (are you taking notes?)

  1. Reverse Chronology (like Facebook)
  2. Frequent Updating (also like Facebook)
  3. A Combination of External Links and Personal Commentary (……. kind of like Facebook)
  4. Pictures of Cats (Alright so it’s basically Facebook)

That last one is a joke, though not really. The internet is a strange place.
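For the programmatically inclined, the first of those principal elements – reverse chronology – really does boil down to a one-line sort. Here is a minimal sketch in Python; every post, title, and link below is invented purely for illustration, and real blogging engines are, of course, considerably more elaborate:

```python
from datetime import date

# Hypothetical posts, each carrying the principal elements:
# a date, external links, and personal commentary.
posts = [
    {"date": date(2015, 2, 27), "title": "On the dress",
     "links": ["http://example.com/dress"],
     "commentary": "It is obviously blue and black."},
    {"date": date(2015, 5, 4), "title": "Age of Ultron review",
     "links": ["http://example.com/trailer"],
     "commentary": "Spader-tron disappoints."},
]

def front_page(posts):
    """Element 1: reverse chronology -- newest post first."""
    return sorted(posts, key=lambda p: p["date"], reverse=True)

for post in front_page(posts):
    print(post["date"].isoformat(), "-", post["title"])
```

Element 2 (frequent updating) is then just a matter of appending to the list often enough, and element 4 is left as an exercise for the reader.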

On the Purpose of Blogs

The intrinsic question being illustrated by the subtitle (in case you are not particularly gifted at reading between the lines) is: “Why?” Now, now, there is no need to suffer an existential crisis at my hands; however, since I am not a supercomputer capable of calculating the precise answer to the Ultimate Question of Life, the Universe, and Everything (which is 42, by the way), I will attempt to answer another question instead: “Does blogging present a valid contribution to rhetoric?”

Self-expression is a salient theme among some bloggers, who find the same opportunity that television talk shows afford their participants: the opportunity to tell their stories in a mediated forum to a potentially large, though distant and invisible, audience.

– Miller and Shepherd, 2004

What makes blogging so different from writing a newspaper article, or a magazine piece, or maybe even a novel or autobiography? The quotation embedded above suggests that one-way anonymity plays a significant role in the prevalence of blogs; however, that doesn’t prevent authors from publishing their books under pseudonyms, or from asking the publisher to leave their names out entirely. What a blog does provide, though, is a channel whereby someone may expose their private thoughts to a potentially countless audience while retaining a completely authentic “personal” or diary-like presentation. The same person can instead choose to abandon the realm of the clandestine and make use of the internet as their own public activity log (I’m looking at you, Twitter). It is from this limbo between public and private that the blog draws the bulk of its strength. The juxtaposition fulfills a rare combination of purposes that no other medium of expression can provide, at least for now. Or maybe forever – who really knows?

Majestic, isn’t he?