Friday, August 31, 2012

Indiana Jones 4 hatred

The fourth Indiana Jones movie is universally hated. What is the most commonly cited reason for this?

Aliens. That's it. Aliens. Sure, there are many other annoyances and defects often cited as well, but the aliens are by far the most common element in all the complaints. In fact, they are the first and often even the only relevant thing that reviewers and other people mention.

Ok, then there's the fridge, of course, which is probably a very close second most commonly cited reason. (In fact, citing the fridge as a reason for hating the movie is even more irrational than the aliens.)

Let's put this into perspective and compare it to the original three movies. A supernatural chest that, when opened, releases ghostly energy that kills everybody not worthy? That's completely ok. A secret cult that can, among other things, rip the heart out of somebody's chest with their bare hands, while the victim remains alive through the whole process? No problem. A cup that grants eternal life? Completely normal.

But aliens? Nooooo! That goes too far! Aliens are way too much! And the fridge too!

What's happening here is a really strong case of nostalgia filtering. The people who complain about the fourth movie are mostly people who saw the first three movies as kids and loved them. Those movies excited their adventurous imaginations. Now, as adults, they have warm, nostalgic memories of those times, when movies were wondrous escapes from reality into fantasy worlds full of adventure. As adults they cannot get that feeling anymore, and they are much more critical of movies. Therefore, if a new sequel is made to the movies they loved as kids, they will be extremely critical and cynical about it. There's no way the new movie can give them the same excitement and wonder.

If the fourth Indiana Jones movie had been made back then, alongside the first ones, with all the aliens and fridges and whatnot, the attitude towards it today would be completely different. Today's adults would remember it fondly and consider it a good movie.

And conversely, if for example the third movie had been made now, people would hate it, even if it were identical. People would always find some ridiculous things to complain about.

Wednesday, August 29, 2012

When will we see an actual Batman movie?

The 60's Batman TV series was basically a farce. It came from a time when, for some inexplicable reason I cannot comprehend, TV and film producers thought that a superhero series or movie should be a wacky, over-the-top comedy. (I really can't understand where the connection between "superheroes" and "comedy" comes from. In my mind there's a disconnect the size of the Pacific Ocean between them.) In fact, that mentality persisted for a surprisingly long time (even in the 90's and 2000's we were still getting superhero movies that were more comedy than anything else.)

Tim Burton's 1989 Batman and its sequel were an attempt to make an actual Batman movie (a series which sadly sank once again into comedy in the hands of Joel Schumacher, a turn of events best left forgotten in the annals of history.) It was ok'ish... kind of. Yes, it was something resembling Batman, but... not really. Batman's suit is not like that, he is not really like that as a persona, the Joker is definitely not like that, the universe is not really like that... It's just not Batman. It's something that resembles it, but isn't really.

Christopher Nolan made his own version of Batman as a trilogy between 2005 and 2012. While this trilogy is highly praised and contains some of the most profitable and best-received movies of all time... I'm sorry to say, but it's once again not Batman. It's something that resembles Batman, but just isn't. It's gritty, it's badass, it's kind of realistic... but it's just not Batman. It's like a parallel-universe Batman that closely resembles the actual one, but just isn't the actual one. And once again the Joker is good in his own right... but he's not the actual Joker. He's something that may resemble the original, but just isn't.

(And again, aaargh, the costume. Why do these people insist on putting Batman into a bulky full-body armor that makes him look like the Michelin Man, unable to even turn his head? How exactly is he supposed to fight anybody inside that? It's not possible. The Batman in the comics does not use a rigid full-body armor because he doesn't need it. He's that badass. He dispatches enemies using stealth and psychology, not by standing in front of them waiting for them to shoot. He's a ninja, not a human tank.)

Another thing that all these movies completely forget is that Batman is a detective. (After all, he's supposed to be the greatest detective in the world.) This aspect is completely missing from the movies. There's just no trace of it.

The closest thing we have gotten to an actual Batman "movie" is the Batman: Arkham Asylum and Batman: Arkham City video games. Now that's what I'm talking about. Here's a gritty(ish) Batman and a cast of allies and enemies that look, feel and act like the real Batman, while still maintaining the kind of "innocence" that's part of the fantasy world of the Batman universe in the comics. Here Batman is Batman, the Joker is the Joker, Catwoman is Catwoman, and basically every single character is the character from the comics, and the entire setting is that of the comics. There are no stupid changes to make it "more realistic" or "more plausible" or anything like that. This is the comics Batman, pure and simple. No compromises, no bullshit. Just pure unadulterated Batman, no more, no less.

That's what I want from a Batman movie.

Tuesday, August 28, 2012

Annoyances when searching the net for info

This is a really small thing... but I think every software developer has been there, and it can get pretty frustrating.

If you are a long-time developer, you have most probably experienced it: You encounter a problem (like a really strange error message, or a strange bug in some library that you just can't figure out, e.g. because the library's documentation is lacking) and you try to search for a solution online. Surely others have had the same problem and solved it.

Very often this is so, and you usually find the answer in the first few Google hits. Sometimes, however, you will see someone asking the very question you need answered, and then answering their own post with just "never mind, I found the solution", without ever explaining what the solution was. You are left with nothing.

A lesser form of this is when someone asks the question, another person answers it, and the first person just replies "thanks, I will try that to see if it works", never following that up with a report of whether it actually worked. This is, of course, a much more minor form, because you can try the solution yourself. However, it's still a bit annoying, because if the original poster had reported that the solution works, you could be more confident in it before starting to test. (After all, the person who responded could simply be guessing, and be wrong.)

Why not try the solution first, and then thank the person who gave the answer?

Saturday, August 25, 2012

Programming job interviews

One thing I detest about job interviews is that you have to lie even if you really mean to be honest. You have to lie in order to convey your true skill properly. (Not that I have extensive experience with job interviews, but this is what I have gathered.)

For example, suppose that you are an experienced programmer who has a good grasp of how imperative/OO languages (compiled or scripting) work, and extensive experience with some of them, but only a very modest understanding of PHP in particular: You know the basics, you have perhaps written a hundred lines of it in total, but you know how it works and what it offers. Most importantly, if you had to, you could quickly learn to use it proficiently and competently.

However, job interviews don't generally ask you that. Instead, they ask you how much you have programmed in PHP.

You have two choices: Tell the truth, or "stretch it a bit".

If you tell them that you have only minimal experience with PHP in particular, they will probably mark you as not a very good candidate for a PHP programming job. Your assurances that you can learn the language quickly, and that this is not just BS, will probably not help much.

The other possibility is to outright lie: You can claim that you have programmed in PHP quite a lot.

In a sense you are not "lying" per se. Rather, you are answering the question that they really want to ask, rather than the question they think they want to ask. What they really want to know is how easily you could start programming in PHP, not how much you have programmed with it in the past. (Of course having a lot of experience in PHP programming always helps, I'm not saying it doesn't. However, even more important is how much programming experience in that type of language you have overall, not how much you have in PHP in particular.)

However, in order to convey your true expertise you have to lie. The bad thing about this is that you can get caught red-handed. If they start asking about some minutiae of PHP, you might not know the answers on the spot, and you will end up looking like an opportunistic liar.

They might well end up hiring someone who has programmed more in PHP (or at least claims to have) but who's not very good at it.

Thursday, August 23, 2012

TV live show editing

Watch this comedy routine of Abbott and Costello performing their famous "Who's on First?" sketch. Watch it fully and then come back, as I have a couple of questions to ask about it.

Question 1: How many times did they show the audience?

Answer: Zero times.

Question 2: How much did it bother you that they didn't?

If you are a normal person, I am pretty sure that you didn't even notice this until I drew attention to the fact. It certainly did not bother you at all.

If this were being televised today as a live show, at least 50% of the footage would be showing the audience reactions. This is something that bothers me to no end in today's TV show editing.

If I'm watching some performers doing an act (be it comedy, magic, juggling or whatever), I want to see the performers. Why would I want to see the audience? What possible interest would I have in that?

Of course it's not the act of showing the audience itself that's so bad. It's the fact that I do not get to see the performer while the audience is being shown, and in fact the performance is being constantly interrupted, which is really, really annoying.

Most performers have practiced their routines over and over in order to make an enjoyable viewing experience for their audiences. Everything they do is done to entertain the audience. (It would be quite catastrophic if the audience got bored.) Hence every single thing they do, from beginning to end, is for the benefit of the audience, and is highly rehearsed and trained to be as interesting and enjoyable as possible.

And then TV directors butcher this highly polished act into bits and pieces, censoring half of it, completely destroying it. In the worst cases I have seen they show the audience even more than the performance itself, even in the middle of a routine. (This is especially annoying with performances that are long and continuous, without pauses, such as juggling.)

In fact, this practice bothers me so much that whenever I see, e.g., a YouTube clip of a TV show with this kind of editing, I just stop watching it. I simply can't stand it.

Why do they do this? Don't they understand that they are destroying the performance and annoying the viewers? Yet they keep doing it over and over, and have been doing it for decades, as if it were good live TV editing.

Wednesday, August 22, 2012

Graphical user interfaces going bad

Once upon a time, when the industry had a good decade or two of actual user experience with graphical user interfaces, a set of good design rules was established. Most operating system development companies even had guidelines for developers on how to create a standardized GUI for their programs, so as to make them as easy and intuitive to use as possible.

These are mostly small things, but they are important. For example, if a program has a menu (as most graphical programs do), it should always be located in the same place in all programs (at the top, below the title bar), and there are certain menus that should always have the same name (such as "File" and "Edit") and contain the same basic set of commands (such as "Open" and "Save"). If a menu element performs an immediate action, it should be named without any punctuation (e.g. "Save"), but if it does not immediately do something and instead opens a dialog where more options can be specified, its name should use an ellipsis (e.g. "Save As...") Dialogs should always have a certain set of buttons named in a certain way (and, in general, they should always have a button to cancel the action.)
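To make these conventions concrete, here is a minimal sketch of such a standard "File" menu using Python's tkinter. The toolkit and the function names are just my own illustration, not taken from any particular vendor's guidelines:

    import tkinter as tk
    from tkinter import filedialog

    def save_document():
        # Immediate action, hence the menu label "Save" has no ellipsis.
        print("Document saved")

    def save_document_as():
        # Opens a dialog first, hence the menu label carries an ellipsis.
        filename = filedialog.asksaveasfilename()
        if filename:
            print("Saving as", filename)

    root = tk.Tk()
    menubar = tk.Menu(root)

    # The standard "File" menu, always in the same place, with the
    # standard set of commands.
    file_menu = tk.Menu(menubar, tearoff=0)
    file_menu.add_command(label="Open...", command=filedialog.askopenfilename)
    file_menu.add_command(label="Save", command=save_document)
    file_menu.add_command(label="Save As...", command=save_document_as)
    file_menu.add_separator()
    file_menu.add_command(label="Exit", command=root.destroy)
    menubar.add_cascade(label="File", menu=file_menu)

    root.config(menu=menubar)
    root.mainloop()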

And so on and so forth.

The purpose of all these rules and guidelines is to unify all programs, make them use a standardized format and layout for common tasks, and thus make it as easy as possible for users to learn to use a new program. The rules are also intended to make it easy and intuitive to know what something does. For example, as mentioned, if a menu element does not immediately perform an action but opens a dialog (and hence it's "safe" to select it without the danger of it modifying anything), its name will have an ellipsis in it. This is a standard and intuitive cue, achieved with a tiny formatting guideline that might seem insignificant out of context. It's all these small things that make a good GUI a good GUI.

There are also many guidelines and principles on how to design a good GUI on a higher level. An example of such a principle is "if you feel the need to add a text label explaining the usage of some GUI element, you are probably doing something wrong". The usage of GUI elements should be, in general, intuitive and easy to understand without textual explanations.

Other useful guidelines include things like what color schemes to use in the application. These are often aimed at making the application easy to use for beginners and people with disabilities (such as poor eyesight.) For example, putting gray text on a slightly grayer background can be a bad idea, because people with poor eyesight may have a difficult time reading it. (The ability to distinguish between low-contrast elements often deteriorates with age.)
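This particular guideline can even be quantified. As an illustration, here is a small Python sketch of the contrast ratio formula from the WCAG 2.0 accessibility standard; the example colors are made up for demonstration:

    def relative_luminance(rgb):
        # WCAG 2.0 relative luminance of an sRGB color given as 0-255 values.
        def channel(c):
            c = c / 255.0
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(color1, color2):
        # WCAG recommends a ratio of at least 4.5:1 for normal body text.
        l1 = relative_luminance(color1)
        l2 = relative_luminance(color2)
        lighter, darker = max(l1, l2), min(l1, l2)
        return (lighter + 0.05) / (darker + 0.05)

    # Gray text on a slightly grayer background: roughly 1.5:1, far too low.
    print(contrast_ratio((128, 128, 128), (160, 160, 160)))
    # Black text on a white background: 21:1, the maximum possible.
    print(contrast_ratio((0, 0, 0), (255, 255, 255)))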

Whatever your opinion might be of Microsoft, Windows 3.x was actually pretty decent in terms of GUI design. All standard window and dialog elements were clear, consistent, and easy to understand and use, and if a program closely followed the standard guidelines of Windows GUI design, it was significantly easier to learn and use than a program that deviated from it.

Lately, however, it seems that many companies are completely ignoring these useful guidelines, and aiming for something that they seemingly think "looks cool" rather than is usable.

A very common trend seems to be to hide things from the user by default. This trend probably started around Windows 95. While Windows 3.x always showed file names in full, at some point newer versions of Windows started hiding file name extensions by default. In fact, they started hiding almost everything by default: You only got a name (without extension) and a program icon.

That's right: A program icon. Not an icon representing the file type, but the icon of a program (the program that would open it.) This means that if you had two files with the same name but different extensions (for example "image.jpg" and "image.gif"), both of which were opened with the same program, they would look completely identical in the file browser. No visual distinction whatsoever. It's impossible to tell them apart without doing some additional operations to find out. This is most certainly not good GUI design.

In Windows ME one of the most braindead ideas ever to come out of any company in existence was defecated by Microsoft: Let's hide "unused" menu items in menus. And, while we are at it, let's reorder the unhidden ones for good measure. This goes against every good GUI design principle in existence, it's a horrible, horrible idea, and it should never even have been thought of. It makes menus basically the antithesis of what good GUI design is.

In Windows 7 Microsoft went even further with all this "let's hide everything from the user" ideology: Now they hide menu bars and title bars by default. This is supposed to make programs look hip and cool. However, it significantly decreases the usability of everything because now you have to do extra steps to get to a menu, and you don't get any useful information that a program may put in the title bar. (For example, web browsers typically put the title of the web page there, and text editors the title of the document you are editing.)

(I have never understood why Microsoft hates menus so much. They seem to be trying their hardest to get rid of them. What exactly is so wrong with menus? They are clear, intuitive and easy to use, they categorize actions in a hierarchical and intuitive manner, and they don't clutter the GUI, because the dozens and dozens of possible actions are neatly packed into expandable menus. The more actions a program can perform, the more important it is for it to use menus, rather than something else, especially since menus are a good solution and have been used for this purpose for decades.)

One of the key GUI design principles is that buttons, clickable icons and anything else that can be clicked should look like it, and if they are disabled, they should clearly look disabled (usually by being colored in dim grays). This is another rule that Microsoft has liked to break for a long time: Clickable icons are no longer distinguished from mere decoration, because their borders are hidden unless you hover over them (are we seeing a pattern here?). What's worse, in some cases they are even drawn in grayscale until you hover over them, making it harder to tell whether they are disabled or not. This is not how you do good GUI design. You shouldn't have to hover over anything in order to see whether it's an enabled, clickable element. It should be obvious as-is.
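As a small illustration of how toolkits have traditionally supported this principle, here is a tkinter sketch (again just an example toolkit of my choosing) where the disabled button is drawn grayed out at all times, with no hovering required:

    import tkinter as tk

    root = tk.Tk()

    # An enabled button: rendered normally, visibly clickable.
    tk.Button(root, text="Paste", state="normal").pack(padx=10, pady=5)

    # A disabled button: the toolkit automatically draws it in dim gray,
    # so its state is obvious at a glance, without any hovering.
    tk.Button(root, text="Undo", state="disabled").pack(padx=10, pady=5)

    root.mainloop()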

Perhaps Apple knows better than this? Nope.

For example, open a new tab in Safari. How do you close this tab? It's not immediately apparent... It turns out that the tab's close button is completely hidden by default, and you have to hover over the tab in order to make it visible. Apparently this is supposed to be hip and cool, but it's completely counter-productive design that makes the usage of the program less obvious for no benefit.

In the first versions of Mac OS X, the current window was clearly distinguished from all the other windows. However, as newer and newer versions have been released, this distinction has become more and more difficult to discern. At this point the active window has almost the same title bar coloring as every other window, making it very difficult to see which one is currently active. Why have they done this? No idea. It only makes using the system harder, for no benefit.

Recently Apple announced that they would hide scrollbars by default. This was a real facepalm moment for me. I was like, "WTF? Are they going to hide everything in the future? Just don't show anything. What exactly is the point?" What possible use would there be in hiding scrollbars? No longer can you see where in the page you are by glancing at the scrollbar; you have to perform additional actions to make it visible. This is almost as braindead an idea as what Microsoft did with menus in Windows ME.

Saturday, August 11, 2012

Approval of vigilantism and murder

I stumbled across a news footage video that had by chance caught the murder (or attempted murder, I'm not completely sure) of a captured child kidnapper and possible rapist by the child's father. Some police officers were escorting the perpetrator in handcuffs, and the father was waiting, disguised, at some kind of public telephone booth, from which he proceeded to shoot the perpetrator with a gun. Clearly it was not something done in the heat of the moment, but something planned and premeditated.

I made the error of reading some of the YouTube comments. In the first several pages every single comment, every single one of them, praised the father's actions. Most called it rightful justice; some called him a hero. Not a single comment of disapproval.

This is just crazy in my opinion. I see two major problems with this:

1) His son had been through a horrible experience that would probably haunt him for the rest of his life. He was probably emotionally devastated and in severe need of support. What does his father do? He risks everything and is ready to go to jail, possibly for life, away from his son. What possible good would that have done anybody? Not only had his son experienced a really traumatic event, but now his father would put him through another: his own father going to jail rather than being at home supporting him.

That man is not a hero. He is an idiot. I don't care how distressed and mad he was about what happened to his son; he is still an idiot. I have no sympathy for him. Making things potentially so much worse for his son after such an ordeal deserves neither respect nor admiration.

2) People's basically unanimous approval of his actions is just preposterous. This was not self-defence, nor was it an overreaction done in the heat of the moment. It was premeditated, cold-blooded first-degree murder. The man had clearly prepared for the situation and planned his actions. Yet most people think that what he did was justified, and a good thing.

I can't believe that in modern society people are so eager to defend murder, vigilantism and taking matters into your own hands. Screw fair trials, screw basic human rights, screw law and order. Someone murdering someone else in revenge is ok according to these people.

And no, I'm not exaggerating here. I commented on the video about this, and several people defended their position, and in fact emphasized it. Some even said that this kind of action should be legal.

What kind of world do these people envision? We have long ago left behind the times when the people of a village stoned to death someone they didn't like, and rightfully so. Murdering people in revenge, especially people who have already been captured by law enforcement, is barbaric.

Someone asked me "don't tell me you wouldn't have done the same thing in his position". I answered along these lines: No, I would not have. I do not believe in the death penalty, and I especially do not believe in vigilantism and taking matters into my own hands, particularly when the perpetrator has already been captured. If you support the death penalty, that's your prerogative, but if you want the death penalty in your country, you should impose it through the proper democratic channels, in other words by voting or by becoming a representative. You do not take matters into your own hands, become a vigilante and start murdering people you don't like, bypassing the law and the authorities.