Sunday, September 30, 2012

Evil Dead 2

There seems to be a really strange consensus that the movie Evil Dead 2 is better than the first movie, The Evil Dead. Having seen both several times, I just can't comprehend the reasoning.

The first movie is a very low-budget pure horror film. Despite its extremely low budget, it's really well made. The filmmakers really utilized every limited resource they had to make the best film they possibly could, and it shows. Many of the special effects might be simplistic and antiquated even by the standards of the time, but they are surprisingly well made and effective considering what they had to work with and how little money they had.

In short, as a horror film it's really effective and well made. It's gory, it's gritty, it's gruesome, it's seriously made, and it doesn't shy away from showing you the gore in full detail.

The second movie is not a straight sequel. It has a relatively short segment at the beginning that's a kind of remake of the first movie, changing many details (such as the number of people who went to the cabin the first time), and the rest of the movie is a continuation of this half-remake. (AFAIK the reason it's not a pure sequel, and why they "remade" the first movie with changed details as an introduction, has something to do with the filmmakers having lost the rights to the original movie or something like that.)

The second movie has three major problems: it's not a horror film but a comedy ("horror comedy", some would say); it does not take itself seriously, instead going for a supposedly "badass" protagonist (who is more hammy than "badass"); and unlike its predecessor it shies away from showing the gorier scenes, which makes little sense.

In short, it's like a bastardized version of the first film, remade and self-censored to get a lower MPAA rating, substituting slapstick comedy for pure gory horror. And for some inexplicable reason most people think it's better than the first film!

They are going to release a(nother) remake of The Evil Dead in 2013. I already know it's going to suck because I'm almost completely sure that they will make it like the second movie instead of like the first one, just because the second one is considered better. Kudos to them if they make it a pure horror movie, but I'm not holding my breath.

Thursday, September 27, 2012

Firefox version numbering

Version numbering of software products is far from being a standardized thing, but the most common convention is to have something along the lines of:

<major version>.<minor version>

For example the version number "2.1" means major version 2, minor version 1. (Generally the major version starts from 1 and the minor version from 0. A major version 0 is often used to denote an alpha or beta version that's not yet complete.)

The major version number usually indicates some kind of significant milestone in the development of the program, and is usually accompanied by significant improvements or changes. Sometimes it can mean a full (or significant) rewrite of the code base (even if outwardly there's little visible change). Regardless of what exactly it is that has changed, it's usually a very significant major change (either internal or externally visible).

Some projects keep the major version so significant that they hardly ever increment it, and reserve it for really huge milestones (such as rewriting the entire application from scratch, or changing it so much that it's effectively a completely different application, even if its purpose remains the same). The Linux kernel is a good example: its major version was only incremented to 3 recently, even though the kernel has been under constant development for over 20 years.

The minor version is usually incremented when new features and/or a significant amount of bug fixes are introduced. In many projects even relatively major new features or improvements only cause the minor version to be incremented.

Some projects use even more minor version numbers. A typical custom is to use a third number (so version numbers look eg. like "2.5.1") which often denotes simple bug fixes or cosmetic changes, but generally no new or changed features. Some even use a fourth number (such as the abovementioned Linux kernel) for an even finer distinction between bug fixes/cosmetic changes and actual feature changes (so that even those changes can be divided into major and minor).
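To make the convention concrete, here's a minimal sketch in Python of how such dotted version numbers can be compared, component by component rather than as plain strings. (The function names and example version numbers are purely illustrative, my own and not from any particular project.)

def parse_version(version):
    """Split a dotted version string like "2.5.1" into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def is_newer(a, b):
    """True if version a is newer than version b. Tuples compare
    element by element, so "2.5" vs "2.5.1" behaves as expected."""
    return parse_version(a) > parse_version(b)

# Numeric comparison gets "2.10" vs "2.9" right; a naive string
# comparison would get it backwards ("2.10" < "2.9" as strings).
assert is_newer("2.10", "2.9")
assert not is_newer("2.5", "2.5.1")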

Anyway, at the very least it's a good idea to use a two-number versioning scheme to indicate a clear difference between major and minor changes. This is very informative: when you see the major version number increase, you know that something really big has happened, while the minor version number just indicates more minor changes.

The Firefox project used to use this kind of version numbering for almost a decade (with the major version slowly incrementing from 0 to 4 during that time). This was quite informative. When the major version increased, you knew that there would be some significant changes to the browser.

Then, for some strange reason, they threw that away. Now they use the major version number for what almost every other project in existence uses the minor version number for, and they keep the minor version practically always at 0. They keep incrementing the major version number every few weeks, just for some minor improvements and added features.

I do not understand the rationale behind this. The version numbering of Firefox has practically lost its meaning and is useless. It's no longer possible to tell from the version number whether one should expect really big changes or just minor cosmetic ones.

Moreover, if they ever make a really significant change or improvement to the browser, hardly anybody will notice, because the version numbering has lost its meaning and cannot be used to convey this information anymore. No more will you see headlines like "Firefox 3 released, major improvements." It will just be "so they released Firefox version 57, so what? What did they change? The color of the close tab button?"

Saturday, September 15, 2012

Pseudointellectualism

In my old "blog" (of sorts) I have written extensively about conspiracy theories and believers in them, and the reasons why people believe in them.

One aspect of this is, I think, that believing in conspiracy theories is a form of pseudointellectualism. Especially people who have memorized hundreds and hundreds of arguments, and can flood a discussion with them in a kind of rapid-fire shotgun argumentation, probably get a sense of being quite smart and "educated": they feel that they are experts on the subject in question and possess a lot of factual knowledge about it, and thus can teach it to others and use all these "facts" to argue their position and win any debate.

In other words, they are pseudointellectuals. They feel that they have a lot of factual knowledge on the subject, and they might get a sense of intellectual superiority, even though in fact they are just deluded. They are often good at debating and arguing their position, but they do not realize that they are spouting nonsense. Of course this nonsense is wrapped in tons of seeming logic and apparently valid arguments, which might sound superficially plausible when you don't examine or study them too deeply, but it's still just nonsense.

The same is true not only of conspiracy theorists but also of creationists, ufologists and believers in the paranormal. (When we examine all these positions closely, there actually isn't much difference between them. They are all basically religious belief systems.)

I think that it's precisely this sense of intellectual superiority that makes at least some of these people so firm in their beliefs. They might not admit it even to themselves, but deep down this feeling of superiority makes them feel good: better and more "educated" than other people.

In fact, some pseudointellectuals use precisely their own sense of intellectual superiority to belittle their opponents.

Perhaps the most prominent example of this, a pseudointellectual on a level of his own, is William Lane Craig. He feels superior for having an education in "philosophy", and has several times belittled his opponents for not having such an education, and therefore not having the necessary "qualifications" to debate him on the same level.

This is just an outstanding level of douchebaggery, an incredible amount of smugness. And that would be the case even if philosophy were a relevant field of science (which it isn't, which just makes his attitude even worse).

I don't think it's easy to surpass this level of pseudointellectualism.

Tuesday, September 11, 2012

Show, don't tell?

"Show, don't tell" is one of the rules of thumb of proper storytelling in a visual media (such as movies, TV series and comics). It means that, in general, it's better to show something happening rather than just telling what happened. It can apply even to written stories, where it means that the events should be "shown" as a narrative, rather than being explained.

This is not, of course, a hard rule. Sometimes it's better to just tell something as a quick summary rather than going to the lengths of actually showing the events in full. Too much "showing" can actually be more boring than just quickly telling what happened.

What grinds my gears is when people use the "show, don't tell" argument to criticize works of art in situations where it really doesn't apply.

There are excellent examples of things not being shown, only hinted at in dialogue. For example, consider the famous "hamburger scene" in the movie Pulp Fiction. The dialogue, and in fact the whole situation, makes reference to those people having somehow screwed over Marsellus Wallace, yet we are never shown what happened. We are only told that it happened. However, we don't need to be shown. In fact, the scene would not be as good if it spent time showing what happened rather than just telling about it through the dialogue. As it is, the scene is just superbly done. (In other words, rather ironically, the scene becomes better because it does not blindly adhere to the rule, and instead breaks it.)

I think that sometimes people use the "show, don't tell" as an excuse to criticize a work of art that they don't like. For example, an acquaintance of mine criticized the speech made by the Architect in the second Matrix movie for breaking "show, don't tell". I honestly cannot understand what exactly he was suggesting. I think the speech is just spot-on and does not need any "showing". (I brought up the Pulp Fiction scene as a counter-argument and, according to him, that was different. I was unable to get a clear explanation of why.)

Monday, September 10, 2012

Necroposting

At least 90% of internet forums out there have a strict rule against so-called necroposting, defined as responding to a thread that has not had any activity in a long time. (The amount of time varies from forum to forum, ranging from years to just a few months.) Necroposting is somehow considered a really bad breach of netiquette. If anybody necroposts, a swarm of people will immediately castigate the culprit with angry reminders that the original thread died several months ago! In fact, a few forums even go so far as to automatically lock threads that have not had any activity in a given amount of time.

I have never understood (nor will I ever understand) what exactly is so bad about "necroposting". None of the arguments given against it make any sense.

So what if a thread has not been active in many months, or even years? Someone might still have something new to add to it. It could be a new perspective, a new idea, or even an update on recent events related to the subject of the discussion. Posting it in the thread that discussed the subject keeps it in context, keeps the forum better organized, and doesn't scatter discussions of the same subject across random threads.

Someone might respond with something that has already been said in the thread (eg. because they only read the first post and responded to it, instead of checking the rest of the thread first). However, this has nothing to do with necroposting. People do that all the time, even in threads that have had abundant activity within the last hour. Reprimanding such a person just because the thread happened to be old (but not doing so if the thread happens to be recent) is irrational, inconsistent and outright mean.

Fortunately there are some online forums that do not have any such nonsensical rule. Yet even in those you can see people complaining when someone necroposts (and regular users then telling them that "we don't have such a rule here"). The irrational hatred of necroposting wants to spread even to forums where it doesn't exist...

There is also another, potentially positive aspect to "necroposting": it draws attention to old threads that might be of interest to new people. When a forum has thousands and thousands of threads, nobody is going to wade through all of them. If someone makes a post in a years-old thread, it's often because the thread is interesting, and the post will bring it back to the top (or at least to the list of threads with unread posts). Some people who had never seen that thread may find it interesting. (And this isn't just speculation. Recently in a forum I witnessed exactly this happen: someone "necroposted" in a very old thread, and somebody else commented that the thread was actually interesting and expressed gratitude for drawing attention to it.)

The aversion to necroposting needs to die. It makes no sense.