Monday, June 29, 2015

Harassment of women in online video games?

One of the biggest narratives of this modern feminist anti-gamer movement is that women constantly have to face harassment when they play online games.

That's of course bad in itself, when it does happen. However, there's a degree of dishonesty in that complaint: It completely disregards the fact that men, too, receive lots of harassment in online gaming. In fact, study after study has shown that men on average actually get more harassment than women. However, this is ignored or even outright denied by feminists. (In fact, a common claim is that a man does not have to fear harassment in online games. This is absolutely and completely false. Yet these feminists get away with making claims like that, and lots of people, even many gamers, actually believe them.)

What is the major difference between men and women who are the targets of harassment in online games? The difference is that men don't complain about it. Most of them couldn't care less. Many of them laugh at it (after all, most harassment is done by sore losers.)

This is why the harassment that men experience in online gaming (which is demonstrably more frequent than what women experience) is mostly invisible: The "victims" don't care nor complain about it. Dare I say, they take it like men rather than bratty children.

Is it right for people to harass other people in online games (or anywhere else)? Of course not. Is it such a huge deal as feminists want to portray? No. Just ignore it. The harassers are idiots and retards. Be the bigger person. Don't whine like a little child whose feelings were hurt.

Saturday, June 27, 2015

Transferring pokémon from one generation to the next

Catching as many pokémon as possible (ie. completing the pokédex, the in-game list of descriptions of all pokémon) and trading pokémon between two players have been staples of the games since the very beginning. And of course leveling up your "alpha team" of pokémon. It can be quite addictive and rewarding.

In the first generations, however, there was a problem: You could not transfer your pokémon from a game of one generation to a game of the next (eg. you couldn't transfer, at least not officially, pokémon from a first-generation game to a second-generation game). This is despite there being no technical reason stopping it. (Every pokémon game fully supports all pokémon of all previous generations.) It meant that you couldn't transfer your level 100 shiny ultra-rare legendary powerhouse from one game to the next, even though you had spent so much time catching and leveling it up.

Generation IV (the first generation for the Nintendo DS) changed this by making use of the fact that the first Nintendo DS had both a normal DS cartridge slot and a GameBoy Advance cartridge slot (the GBA being what generation III ran on). This meant that if you owned games from both generations, you could transfer your pokémon from the generation III cartridge to your generation IV game by inserting both cartridges into the console. (Even then it wasn't made a completely straightforward process. There was a mini-game involved in doing the transfer, and you were limited to transferring 6 pokémon per day. Nevertheless, you could still do it.)

Transferring from generation IV to generation V (both being DS games) was more difficult, though. Impossible for most people, in fact. This is because it required having two DS consoles and transferring the pokémon between them over a local link. Unless you had a friend with another DS, you just couldn't transfer.

Transferring from generation V (DS) to generation VI (3DS) is much easier, though: Once again you need only a single 3DS device. You simply install the "poké bank" application (together with its companion "poké transporter" app), insert your gen V game into the console, transfer your pokémon to the "bank", then insert your gen VI game and transfer the pokémon from the bank into it. Easy peasy. Only one 3DS needed, no hassle.

Except for one big thing: It's not free.

You see, in order to use poké bank, you need to purchase a yearly subscription. Sure, the price is relatively small (only $5 per year), but still...

I'm not saying they don't have the right to charge money for its usage, but come on. The games themselves are not exactly cheap (and unlike with PC and desktop console games, 3DS games tend not to drop in price at all, even after many years). I have already bought not one but two of the games (rather obviously, else this transfer thing wouldn't make sense), neither of which was exactly cheap, and I just want to transfer some data from one game to the other, and you are charging me for that?

They don't offer any cheaper shorter-term alternatives either. It's a full year or nothing. Yet I would only need this for something like an hour or less. The rest would be useless to me.

This is an online service (the pokémon data is actually transferred to an online server rather than kept locally, which also allows transferring them from one game to another without the need for two devices), and the yearly cost is for the maintenance of those servers. However, this whole thing could support purely local transfers, without sending the data to an online server. But no, it doesn't support that.

Moreover, this is in fact the only possible way to transfer from gen V to gen VI. There is no other way (not even if you have two consoles). Gen VI games simply do not support transferring directly from gen V games (eg. over a local link). Trading is also not possible between generations (and never has been; this makes sense, because older games do not support the new pokémon introduced in later generations, so it would be unworkable).

As said, the games are not exactly cheap, and I have already purchased two of them. Why can't this be a simple free add-on? (At least support local transfers for free, even if an online service would be commercial.)

Friday, June 26, 2015

Strange pre-60's moviegoing practices

Nowadays, when you go to a movie theater, the protocol is rather simple and clear: You buy a ticket prior to the start of the movie, you enter the theater, and when the movie ends, you leave. (Most theaters won't sell you a ticket for a showing that has already started, although it depends on the country and the theater. In many theaters you are allowed to enter a showing in the middle of it if you have a valid ticket, but this too varies. Many theaters stop admitting people a certain time after the movie has started.)

This makes sense, and it has been so for a pretty long time. However, it hasn't always been so.

You see, prior to about the 1960's, at least in the United States (and many other places) most movie theaters operated quite differently. It wasn't a business model (nor a moviegoing culture) of "you pay to see one movie, you enter when the movie starts, you get to watch it, then you have to leave". No, it was a business model of "you pay for admittance, you can enter and leave whenever you like, and watch whatever happens to be on the screen at the time".

It was, in fact, surprisingly common for people to go to a movie theater completely ignoring the starting times, walk in and start watching, even if the movie was already halfway through. Then, when it ended, they simply stayed for the next showing and watched the first half, up to the point where they had come in, before leaving.

From a certain perspective this made some sense. You see, this was before the widespread adoption (and even existence) of television, and movie theaters were (besides newspapers) the most common source of news and visual entertainment. Newsreels were shown between the movies, and they were popular.

It still feels like a strange practice, though. Watch the latter half of a movie, then some newsreels, then the first half of the movie, and leave. But that was quite common in pre-60's America (and probably elsewhere).

You might have heard that when Alfred Hitchcock made Psycho in 1960, he demanded that nobody be admitted into the theater after the movie had started. That demand was precisely because of this common practice, and it was also one of the most influential events in changing the practice into what it is nowadays.

Why must video games be more "politically correct"?

Assume that somebody made a big-budget, high-profile video game where the player takes control of one or more Nazi officers in the Auschwitz concentration camp in 1942, and is tasked with handling the prisoners, with everything that entailed, including the mass extermination. The game would be completely "neutral" in the sense that it would simply depict the day-to-day duties of such officers in a very realistic manner, with no political or ideological message, no "preaching". There would be no happy ending. (The game could end, for example, somewhere around 1943, when the camp was still in full operation.)

Such a game would cause outrage. Morality guardians, the media, and basically everybody would be all over it, declaring it the worst thing that has ever come out of the gaming industry.

But why?

This kind of depiction (especially a neutral one, as described above) is in fact more or less "allowed" in other forms of media, such as literature and film. Such a book or film, if well done, would be described as a gritty and harsh depiction of historical reality. Even if the work is not "preachy" and doesn't try to hammer down some kind of lesson or political idea, it would still be considered to be teaching some kind of lesson implicitly.

This is allowed in other forms of media even with non-historical, ie. completely fictional, depictions of brutality and injustice. In other words, even if the work of art doesn't have the excuse of depicting actual history and is entirely fictional, it's still allowed (especially if it's well made).

But not so with video games. For some reason video games are held to much, much stricter standards of political correctness.

To be fair, there is some leeway. There are some games where the player controls a criminal, and there is no punishment for said criminal at the end, nor any kind of preachy message that crime is bad. The Grand Theft Auto series would be the quintessential example. However, said series (and the few other similar games in existence) have come under heavy criticism from moral guardians. Much more so than any movie or book with similar depictions (of which there are plenty). Heck, some books and movies that depict criminal life in a completely non-preachy manner are considered among the best books and movies ever made.

But while there is a bit of leeway with games like Grand Theft Auto, a game like the one I described at the beginning would never fly. It would cause an enormous amount of controversy. So much, in fact, that I don't think any game company of any recognition would dare make such a game in a million years.

Sometimes these heightened standards for video games in particular go to ridiculous lengths.

For example, in Germany depicting the Nazi swastika is forbidden by law, with an exception for historical books and films. Video games get no such exception: in games it's forbidden, period, regardless of how historically accurate the game might be or what the context of using the symbol is. It makes no difference.

Recently Apple pulled all games depicting the Confederate flag from their App Store, with the argument that the flag is "racist." This included all the American Civil War games. Again, historical accuracy and context had no bearing on whether a game was pulled. And, most egregiously, Apple stated very explicitly that they are doing this with games only: making political statements and using these kinds of symbols is completely ok and allowed in their digital book store and in movies, just not in their app store. In other words, they stated outright that games are held to a different standard than books and movies.

That's right: You can publish a book with the Confederate flag on the cover in Apple's digital bookstore, even if the book endorses the use of the flag or the politics behind it, but you are not allowed to publish a video game of, for example, the American Civil War if it contains the Confederate flag, completely regardless of the context or what the message (if any) of the game is.

This is completely silly. Why are video games held to a different standard than other forms of media? Why the double standard?

Thursday, June 25, 2015

Disagreeing for no good reason

When I was in the military (yeah, military service is mandatory here in Finland), at one point a suggestion box appeared in our barracks. I got an idea: The rooms didn't have mirrors in them, and I thought a mirror would be great, especially since we often had to put on ties for special occasions, and so on.

When I presented this idea to the other guys (there were something like 12 of us in total), to my complete puzzlement every one of them disagreed, and not a single one of them could give any rational reason why. They just disagreed, period. To this day I don't have the faintest idea why; I don't even have a layman's hypothesis about the psychology behind it. There would have been zero drawbacks to us having a mirror in the room (and if there had been any, they would certainly have brought them up), yet they all disagreed without giving a single reason. Thus we got no mirrors.

This isn't a unique case. Of course it's not tremendously frequent, but it does happen from time to time. Nowadays online it happens quite a lot.

EEVblog is a YouTube channel that frequently publishes videos about electronics. One recurring series on the channel consists of "teardown" videos, where the host opens up an electronic device and examines its interior. The motto of the series is "don't turn it on, take it apart".

He doesn't always follow that principle, and occasionally he does turn the device on before tearing it apart, but oftentimes he does not. In one video in particular I would have been interested in seeing the device in action before he opened it. I made the suggestion that, to make the videos more beginner-friendly and watchable, he could spend something like half a minute briefly showing the device in action, ie. what it does and what it looks like when it's turned on and being used. (This particular device, some kind of multimeter, had a display, so it would have been interesting to see what the display looked like when turned on, and what kinds of things it would show and how.) Since the video itself was over half an hour long, adding half a minute to it wouldn't have made it significantly longer, but it would have made it more interesting.

Well, you guessed it. Almost every single person who responded to my suggestion disagreed, and none of them could give a good reason why. (To be fair, in this case I did get a couple of agreeing comments. However, they were drowned out by dozens and dozens of disagreeing ones.) The thing is, if the original video had had that kind of short segment showcasing the device, I'm completely certain that nobody would have criticized it for that; on the contrary, they would have liked it. (In fact, I'm certain that if somebody had then criticized the video for it, these same people who disagreed with me would have defended the video.) Even when I pointed this out, they wouldn't budge.

These are far from the only examples I have experienced, but I think they are illustrative. Sometimes, for some strange and random reason, lots and lots of people just disagree as a group, even when not one of them can present an actual good reason why. (It would be quite different if some people disagreed while others agreed; that's normal. What's strange is that sometimes all of them seem to get this urge to disagree for no good reason.) And it's quite random; you can never predict when it will happen.

"Victim blaming"

Consider this advice:

Whenever you connect to the internet with a computer or other such device, you are taking a conscious risk. Your computer may be hacked, your information may be stolen. While technology has worked hard to erase this risk, and while a lot of progress has been made, we are still not there. The risk of being hacked is always present, and you should be aware of it. However, there are things you can do to minimize the risk. Use firewall software, keep your anti-virus and anti-malware software always updated, never send critical information (such as credit card details) over an unsecured connection, never send private personal information through email, Skype or any other form of unsecured communication, and do not respond to emails or other messages offering you deals that are too good to be true (such as somebody offering to transfer millions of dollars to your bank account, or any other "get rich quick" scheme). If a strange email has an attachment, do not open it.

I think you would agree that's good basic advice that everybody should be aware of.

Now assume that after I have given that advice somebody said to me: "That's victim blaming! You are a horrible person, blaming the victim! Don't tell me to use secure connections, tell the hackers not to hack me!"

I think that you would probably be as flabbergasted as I would. Is that person completely insane and delusional? How exactly is giving this sensible advice "victim blaming"? What kind of person on Earth could think that giving this advice is a horrible thing to do?

Well, let me answer that question: Feminists.

Recently there was a scandal where celebrities' cellphones were hacked and their nude photos stolen and spread on the internet. When technology experts gave the advice "do not take nude photos of yourself with your phone and send them to your friends, because, like any piece of technology connected to the internet, phones can be hacked", the feminist crowd shouted "victim blaming!"

Feminists have gone completely insane.

This "victim blaming" craze started with this mentally retarded idea that if someone tries to find patterns in rape cases, and to find measures that can be taken to minimize the risk of being raped, that someone is "victim blaming". ("Don't tell me what to wear. Tell men not to rape.") These people who try to provide useful advice are regarded by some feminists as worse than the rapists themselves.

This angers me to no end. These feminists are assholes. They are scum. If there's anybody that's worse than a rapist, it's these mentally retarded feminists who disparage people who are only trying to help, people who are actually trying to prevent rapes from happening. It's not the people looking for patterns and coming up with preventive measures who are worsening the rape situation; it's the idiotic feminists who disparage them. These feminists are actively working against preventive measures. It almost feels like they don't want to prevent rapes from happening.

When these feminists are confronted with this, they will often defend themselves by saying that the advice is useless. But whether the advice is useful or not has no bearing on whether giving it amounts to "victim blaming". This is so insane that it's infuriating. When having this conversation, they will always fall back on "the advice is useless", and sometimes even acknowledge that yes, the advice is given in good faith and with good intentions... yet they then keep on with the "victim blaming" crap. There's a strange double-think at work, where they acknowledge the innocence and good faith of the advice while at the same time still holding that it's "victim blaming" done in bad faith. This double-think is often so deeply ingrained that it could be considered a mental illness.

On top of that, their alternative is rather unworkable. "Teach men not to rape"? Would that work with other crimes as well? "Teach thieves not to steal." "Teach hackers not to hack." "Teach people not to murder."

Sure, education goes a long way towards lessening the prevalence of crime. However, it will unfortunately never get rid of it completely. Also, and most egregiously, relying on education to the exclusion of preventive measures is just pure insanity. But that's exactly what these feminists are proposing: Forget the preventive measures, rely solely on some idealistic education scheme.

Also, is the advice really as useless as the feminists claim it to be? I don't think it is. Just consider this advice:
  • If you are going out to a night club, party or other such place, never go alone. Criminals, rapists and date rapists prey on women who are alone, because they are the easiest targets.
  • Always be aware of where your friends are. You are protecting them as much as they are protecting you.
  • A male companion offers more implicit protection than a female one. This may reek of implied sexism, but you have to be pragmatic: sexual predators are much less likely to target a woman with a male companion; that's just a fact of life.
  • Do not go anywhere alone with a man you don't know. Most men may be completely ok and nice, but that one day when you go with the wrong man... Just don't do it unless you are completely certain of what you are doing.
  • Be aware of the implicit social messages you are sending. Yes, this includes how you dress. Forget the feminist outrage and be pragmatic: How you dress sends an implicit message to other people in our society, and you just have to be aware of it. If you are a woman alone at a night club wearing a sexy dress, some men will interpret it as an implicit message that you are looking for fun. You can choose to be outraged about that, or you can be pragmatic about it. Be aware of it.
  • Random rapes (as opposed to date rapes) happen more frequently in certain parts of the city than in others. Be aware of this.
Is this really such bad, useless advice? Is it really so useless to, for example, avoid going alone to a night club as opposed to going with a group of friends? Sure, it may suck that a woman can't safely go alone to such places, but being aware of this fact and giving the advice is not a bad thing. It's a preventive measure.

Why is "do not go alone to a night club" victim blaming when, for example, "do not send your credit card information over an unsecured line" is not? This whole "victim blaming" thing is just a huge pile of bullshit.

What happened to id Software?

In the 90's id Software was one of the most influential game developers in the world. In the first-person shooter genre, arguably the most influential.

While Wolfenstein 3D was somewhat of a hit, perhaps no other game in history has had as big an impact and influence on video gaming as their next hit, Doom. This was the first-person shooter. Heck, for several years all subsequent first-person shooters by other companies were called "Doom clones". (The term slowly gave way to "first-person shooter", but for quite a few years "Doom clone" was in widespread use.)

Doom had many technological innovations unseen at the time, and it hit the market at the right time.

And that was one of the staples and defining characteristics of id Software: Innovation. Their games were always technologically ahead of most competitors, and they were essentially the leaders of first-person shooter gaming technology, while all other companies were followers and imitators.

That's not to say that other companies didn't make technological innovations of their own. Of course they did. It's just that for many years they lived in the shadow of Doom, and later Quake, from the gamers' perspective. (For example, the engine used in Duke Nukem 3D was technologically significantly more advanced than even the latest Doom game engine, yet it still had to live in the shadow of the latter, always being compared to Doom.)

Their next big hit was Quake, which also brought significant technological innovations. While certainly not the first truly 3D game (ie. with free geometry, movement and orientation), it was in many ways the most advanced and, especially, most efficient game of its type for a time. By this point the term "Doom clone" had pretty much been phased out, but other first-person shooters were now often compared to Quake, and often had to live in its shadow.

Quake was initially fully software-rendered, but later got an OpenGL port. However, the first "properly" hardware-accelerated game by id Software was Quake 2. Again, it introduced a host of technological innovations which most other games and game engines had to catch up to.

At this point id Software was also a big name in game engines. While not the only reusable game engine around, theirs was among the most popular. Quite a few games used the Doom, Quake and Quake 2 engines. By this time they had a big "feud" with their biggest competitor, the Unreal Engine, but they were still one of the leaders.

However, something happened some time between Quake 2 and Quake 3 Arena. While the latter was a popular game, it was not such a huge innovation, nor was its engine such a huge hit. It was no longer the first-person shooter that overshadowed all others and to which other games were compared. By this point other first-person shooters could stand on their own rather than being measured against the latest id Software title. In other words, id Software was not the "leader" anymore, just another competitor on par with everybody else.

Doom 3, too, tried to innovate, and it did in many respects, and it was somewhat of a hit... but somehow it never attained the position the first Doom had. Quake 4 went largely unnoticed.

By this point id Software's game engines had lost most of their market share. The big name was the Unreal Engine, along with some smaller engines. Somehow id Software's engines had been relegated almost to obscurity, and almost no other company used them. Neither their games nor their game engines were the market leaders anymore; other companies had surpassed them by a wide margin.

id Software then tried to innovate with their next game, Rage. It certainly did have many innovations... but they mostly seemed hollow and unimportant. While many of them sounded good on paper, they didn't really make the actual game stand out from contemporary games, technologically speaking. It didn't look any better than other games using more "traditional" existing technology (and in some respects it even looked worse). The game itself was also bland, and didn't have much success.

Only a handful of other games used its engine. By this point the Unreal Engine had become the de facto first-person shooter engine; the number of games using it is staggering. Other, once smaller engines, such as Unity, had also surpassed id Software's in popularity, and by quite a lot.

Try as they might, it seems that id Software just can't regain their former glory, their former leader position, where everybody looked up to them, and everything else was compared to their games. Other game engines and games surpassed them, and went far ahead. Perhaps other companies understood what was practical about developing games and making them look good, rather than being so "innovative" with exotic new features.

At least id Software is still alive and kicking, and they are making very good efforts to catch up with the leaders, but somehow they keep falling short. They just aren't the big name they used to be.

Wednesday, June 24, 2015

Making legally-unenforceable contracts with impunity

I am most certainly not a lawyer, and some of the things I'm writing here may be wrong, so keep that in mind.

Many companies have a tendency to claim rights they don't legally have, or to put restrictions on their clients that can't be legally enforced. As a concrete example, Finnish law is unusually clear on the question of whether you are allowed to make backup copies of software you currently have a legal right to use: Absolutely yes. Moreover, any usage license that says otherwise is explicitly declared unenforceable (in other words, it has no legal force behind it and can't be enforced by law, even if you fully agree to the license; your agreeing to the license does not override your legal rights).

Yet many software usage licenses explicitly forbid making backup copies of the software. As said, even if you agree with the license, you can still ignore that limitation and make as many backup copies as you want. (Your right to those copies ends when your rights to the original end, naturally.)

Another, perhaps more minor, example is that, at least some years ago, many websites had terms of use stipulating that you can't deep-link to any of the sub-pages within the website, only to the main page. Again, this is legally unenforceable in basically every country. It is certainly unenforceable here (which is especially egregious when it's the Finnish website of a Finnish company and the terms clearly refer to Finnish law).

This sometimes goes beyond generic usage licenses. Companies sometimes make personal contracts with individuals in which they claim rights, or try to impose restrictions on that individual, that couldn't be legally enforced even if said individual broke the agreement on those points. For example, certain non-disclosure clauses in certain contracts can be legally dubious, if not outright unenforceable. (For instance, agreements of the kind "we'll pay you money for making a positive review of our product, and you agree to full non-disclosure of this contract, ie. you won't tell anybody that you have made this contract with us.")

This practice is very deceptive. Most people who read the agreement get the impression that by agreeing with it, they are bound to obey those stipulations, and might not know that legally they don't have to. Thus companies get extra benefit from this, effectively bypassing the law, and the rights of the individuals. This is gaining (often financial) advantage by deception; in other words, pretty much the definition of fraud.

The thing is, as far as I know, none of this is illegal. In other words, companies do not get into any kind of trouble for claiming rights they don't legally have, or putting limitations on their customers or clients that they can't legally enforce. They can deceive their customers and clients with impunity. The law doesn't punish companies for doing this (except perhaps in the most egregious high profile cases.) There is no law that says that companies can't do that (there are only, at most, laws that say that such stipulations are null and void, and can't be enforced, but without any repercussions to the company.)

Dishonest questionnaires and statistics

Assume that I had a political or ideological agenda that compelled me to prove that a good portion of the population thinks murder is acceptable, and that I wanted to prove this with a questionnaire. How would I do it?

If I were an honest person, I would pose the claim "murder is acceptable" and then provide two answers, "agree" and "disagree" for people to answer. Of course this method wouldn't give me the result I want, unless I'm so devious as to outright lie about the results. The vast majority (if not all) of the people will obviously answer "disagree". That won't do at all.

There are, however, other methods that can be used besides just outright lying about or distorting the results.

Suppose that instead the claim posed is "killing another person is acceptable" and now the range of possible answers is between 0 and 5, where 0 means "completely disagree" and 5 means "completely agree". What I won't tell the subjects is that the range is just an illusion, a diversion tactic, and that in reality I will be interpreting the results as there only being two answers: Zero and non-zero (with all non-zero answers meaning "I agree that murder is acceptable".)

This way I can trick people into giving me non-zero answers (which is what I want). When you give a range of possible answers between the two extremes, this introduces some nuance into the answer. After all, many people would agree that there may be some rare extreme circumstances where killing another person is acceptable, such as self-defense to save one's own life, or the police killing a criminal that's an imminent mortal danger to others, as the last resort. Such people would probably give a non-zero answer (most probably a 1, perhaps even 2).

But as said, this is all just a diversion tactic, a fraud. I will be interpreting all non-zero answers as "finds murder acceptable". This way I can make sensationalistic proclamations about the murderous nature of our society, based on actual questionnaires, with actual percentages. (Naturally I won't be proclaiming how I came up with those percentages.)
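To make the arithmetic of this trick concrete, here is a small Python sketch. It's purely my own illustration: the response distribution is invented, not taken from any real survey, and the 4-or-5 threshold for "honest" agreement is just one reasonable choice.

    # Hypothetical answers on the 0-5 scale (0 = completely disagree, 5 = completely agree).
    # The counts are made up for illustration: most people answer 0, a fair number answer
    # 1 or 2 (thinking of rare exceptions like self-defense), and almost nobody answers high.
    answers = [0] * 70 + [1] * 20 + [2] * 7 + [3] * 2 + [4] * 1

    total = len(answers)

    # Honest reading: only clear agreement (4 or 5) counts as "finds it acceptable".
    strong_agree = sum(1 for a in answers if a >= 4)

    # The diversion tactic: every non-zero answer counts as "finds it acceptable".
    any_nonzero = sum(1 for a in answers if a > 0)

    print(f"Honest headline:           {100 * strong_agree / total:.0f}%")  # 1%
    print(f"'Non-zero means agrees':   {100 * any_nonzero / total:.0f}%")   # 30%

Same answers, same data; the only thing that changes is where the line between "agrees" and "disagrees" is quietly drawn.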

All this sounds rather theoretical. The sad thing is that I pretty much just described an actual recent "study" that employed this tactic. Not for the acceptance of murder, but for the acceptance of rape. This "study" found that something like 30% of male college students thought that forcing a woman to have sex is acceptable.

Egregiously, if you actually dug into the details of how the "study" was made, it was exactly as I described above: A claim more ambiguous than simply "rape is acceptable", a range of answers from 0 to 5, and the "study" interpreting all non-zero answers as "finds it acceptable".

I will perhaps never understand why social justice warriors are so eager to inflate numbers and distort statistics. It's dishonest to the core. And what's the purpose of it?

"Wire-fu"

The late 70's and the 80's were, in some ways, the "golden age" of eastern kung-fu movies (mainly those made in Hong Kong and China). These movies are often campy in the extreme, with heavy over-acting, very simplistic plots, and lots and lots of over-the-top fight choreography (although exceptions to all of these "flaws" exist, of course).

There's one distinguishing feature in most (although admittedly not all) of these movies: All the fights are "real" in the sense that they are 100% fight choreography without any external aids or camera trickery. In other words, what you see on film is exactly what the actors / martial artists did in real life. Sure, it's usually not "real fighting" in any sensible sense (because it's often very over the top and choreographed), but everything is done for real. These fight scenes are really fun to watch.

For some reason this art seems to have been lost. The vast majority of Chinese "kung fu movies" nowadays use wires (removed in post-production) and other aids to "enhance" the choreography to completely unrealistic levels. (This is called, often derogatorily, "wire-fu".) It seems that it has become the norm. And I really detest it. It makes the fights look completely unrealistic and unnatural. In other words, it breaks willing suspension of disbelief too much for comfort. It ruins the fight scenes.

I am willing to give a pass to movies that are clearly more in the realm of fantasy, such as Crouching Tiger, Hidden Dragon. But not to movies that are supposed to be "realistic".

Perhaps the most egregious offender in recent times is the 2008 movie Ip Man, a semi-biographical film based on the life of Yip Man, a grandmaster of the martial art Wing Chun and the teacher of Bruce Lee. The wire-fu just ruins the movie for me.

That's not to say there aren't any modern kung-fu movies with 100% "real" choreography. It's just that way too many of them use wire-fu.

Sunday, June 21, 2015

Story spoilers in video game reviews

This is written mainly with video game reviews in mind, but the same holds true to some (although a bit lesser) extent for movie reviews as well.

There seems to exist this very common and widespread notion that the review of a work of art (especially a video game in this case) has to contain a summary of its first act. In other words, a brief explanation of what the story is about.

I don't really understand this. That is a spoiler, plain and simple.

And I really, really hate spoilers with a passion. If I start playing a game, I don't want to know anything about it. Nothing at all. I want the story to come as a surprise, for it to develop as I play the game, and everything being new.

With video games in particular, the premise of the story is not a deciding factor in whether I'm interested in playing it or not. I can't think of a single game that seemed interesting to play but that I then skipped because the story summary in a review sounded too uninteresting. That just doesn't happen. (It might happen with some movies, but very rarely.)

A review of a video game can be perfectly well written without telling a single thing about the contents of the story. (Of course the review can allude to the quality of the story at the meta-level, but without revealing any details at all.)

What is more interesting in the review is to know if the game is actually fun and enjoyable to play. The story can be a big part of the gameplay experience, but I don't want to know what the story is, I just want to know if it's good (in the opinion of the reviewer).

Many reviews seem to include this kind of summary just because it kind of "belongs" to a review, because of tradition or something. These spoilers are completely unneeded.

Saturday, June 20, 2015

Are modern FPS games "dumbed down"?

One very common sentiment I see from old-time "hardcore" gamers (especially PC gamers) is that the vast majority of modern first-person shooter games are "dumbed down" compared to the greats of the 90's (ie. Doom, Quake and the other popular ones back then).

While I can appreciate how they feel, I really can't agree with the sentiment. In fact, I can present an argument for the exact opposite position.

Doom and Quake were essentially nothing more than shooting galleries. No story, no characters, no interaction with NPC's. It was simply one abstract level after another, levels with no narrative design or consistency (eg. no attempt to depict an actual location, like a real space station or a factory), and your only mission was to kill enemies, push buttons and find the exit. That's it. There was absolutely nothing more to either game (nor to the myriad other similar games that are highly regarded by these people).

How much more "dumbed down" can you get than that? (Perhaps the only way would be to reduce the number of weapons and types of enemies.)

Compare that to modern first-person shooters. They have very elaborate stories, and very elaborate levels. Level design is invariably a lot more concrete and supports the narrative. While admittedly even today most FPS games consist of killing enemies, pressing buttons and finding the exit, most of them do present additional tasks as well. (In other words, while with some games you could argue that what you need to do is the exact same thing as with Doom or Quake, you can't argue that modern games are "dumbed down" based on that. At most they would be as "dumb" as Doom or Quake were.)

(Curiously, early alpha versions of Doom had levels that were less abstract, with recognizable locations like locker rooms and what was more clearly the inside of a space station. These very early test levels also had fellow marines accompanying you, and an actual story of some kind was clearly planned. For some reason, however, in the final published version all the levels are highly abstract with no real recognizable features, the real-object textures (such as locker doors) were replaced with more abstract ones, and the story was pretty much excised from the game. It was, dare I say, "dumbed down" quite a lot compared to those alpha versions.)

Many of these people talk about modern FPS games being "easier" than Doom or Quake. But are they? They seem to forget that both games had different difficulty settings, and they were quite trivially easy on the lowest settings. They also seem to ignore that a good portion of modern FPS games also have difficulty settings, and can be really hard on the highest settings, just like Doom and Quake were.

What I believe is happening here is that they found Doom and Quake to be really hard when they were young and inexperienced with FPS games, but nowadays they find FPS games relatively easy because they have decades of experience with them. They are comparing the experiences in their youth, when they didn't know how to play all that well, to their experiences today, when they have mastered the art of FPS gameplay. I think that they are not realizing this.

Another aspect of this is that modern games tend to implement lots of anti-frustration measures and tutorials that these old-time hardcore gamers find patronizing. I think this is a bit of a silly thing to complain about, because you can safely just ignore those things. (Also, anti-frustration measures often make playing a lot more enjoyable, by their very nature. Why would anybody want to play video games that are frustrating and unenjoyable?)

Then there's of course the question of linearity. Can the average modern FPS game be considered "dumbed down" because it's more linear than Doom or Quake were? Well, it depends on your perspective.

Doom and Quake were actually not as non-linear as people seem to want to think. Sure, each level was rather non-linear, often being quite open and requiring you to traverse to different parts of it to press those buttons and acquire those keys. However, the levels were relatively small in size, and the progress from level to level was purely and absolutely linear. At no point would you go back to a previous level to do anything. You passed one level, and that's it; you go to the next level and never come back. That is, in fact, quite linear gameplay (even though within the level the gameplay might not be as linear). When you think about it, it wasn't actually all that different from the average modern FPS game.

(And there are, of course, plenty of pretty non-linear FPS games out there. Those with a great deal of exploration to them, and where you can go back to any level you have already been in, and sometimes even have to go back for something. Alien: Isolation comes immediately to mind as a perfect example.)

When a game is too linear, it can be a bit bothersome. However, linearity allows for stronger storytelling, and when that part is well done, the linearity is not such a big deal. Sometimes a video game, even a first-person shooter, is a storytelling experience, and that's ok. Is such a game "dumbed down" compared to Doom and Quake (which had no story whatsoever)? Your mileage may vary.

Many modern FPS games, on the other hand, contain technological and game design innovations that FPS games of the 90's did not have. Stealth-based gameplay is a perfect example. Physics engines are another (especially when those engines are used for actual gameplay). Enemy AI has improved drastically compared to Doom and Quake (with enemy squads using tactics like flanking and taking cover, flushing you out with grenades, etc.) This is the exact opposite of "dumbing down".

In my opinion modern FPS games are not "dumbed down" compared to Doom, Quake or any of the other great 90's FPS games. In fact, I'm of the opposite opinion.

False memories

I have been wondering for quite some time now: How many of our memories are false?

I think most people have had the experience of disagreeing with a friend on some past event: You are completely and absolutely certain that the event happened in one way, and your friend is absolutely certain that it happened in another way. You can't both be right. At least one of you has to be wrong.

If both of you are certain of remembering the event correctly, then at least one of you is having a false memory. (Perhaps even both of you?)

Many people have also experienced another form of this: You are quite certain that some event happened in a certain way, and later you get undeniable proof that you remembered incorrectly. (The mildest case of this is remembering the events of a movie you saw a long time ago: You are certain that something happened in the movie in a certain way, but when you watch the movie again years or decades later, you notice that you misremembered it. It comes back to you when you see it again, and you recognize that you had a false memory of it.)

False memories may form in a multitude of ways. Simply remembering something incorrectly for whatever random reason is of course one of them, but not the only one.

In some cases you might have imagined an event, or read it in a book (and got a mental picture of it), or somebody told you about it, and then years or decades later you misremember it as having been a real event that you personally experienced.

Oftentimes we tend to "fill in the blanks" with our own deductions (or imagination) on incomplete events. Perhaps we witnessed only part of it, or saw only one side of it, and we fill the rest in our head. Again, years later we may misremember this extra filler as having been an actual event we witnessed. Often we outright forget some details of an event, and fill in the blanks in our head later. Many events can change drastically in our head this way.

Sometimes you misinterpret what you are witnessing. You might not understand completely what's going on, and make a (mistaken) interpretation of it. Then later you "remember" what you interpreted as having actually happened.

In a few cases we might want to believe something really happened in a certain way so badly, that with time we start believing that it indeed happened that way. We forget what really happened, and substitute the memory with something else. A form of auto-suppression. (This may be rarer, but it can happen.)

This makes me wonder: How many of my memories are false? Even those memories that I am 100% sure of? Could at least some of them be actually false? How would I know?

How sure are you that all your memories are accurate?

Friday, June 19, 2015

Taking VR headsets needlessly far

The Oculus Rift virtual reality headset has been in development for over 3 years now. There's nothing wrong with that. It's better that they polish the technology to be as good as is possible with current (and affordable) hardware than rush a half-baked product that is clunky and works poorly, and then push upgraded versions to the market as they develop them.

Those years of development have not been for nothing. The headset has developed quite a lot compared to its first prototype. It has become smaller and lighter (although that has always been a goal), more responsive and accurate, and has gained additional features (such as detecting head tilt and the physical position of the headset, rather than just its orientation, neither of which the original prototype supported).

Other companies have quickly jumped onto the bandwagon, even before the Rift has been released. (For example, Sony has been developing their own version for the PS4. We'll see if that pans out.)

There is one trend in the development of the OR (and the competing products as well), however, that worries me: Too much effort put into gimmicky features. The kind of features that may make awesome presentations, but which are ultimately completely useless for 99.9% of users.

There's a name for the completely useless gimmick I'm talking about: Augmented Reality.

"Augmented Reality" is the gimmick of the VR system taking live video of what's in front of it, and then adding CGI to it.

For example, they have hyped quite a lot how you can set up the system so that you can walk around your room and interact with CGI elements added by the system. It would be almost like walking around a Star Trek style holodeck, with the computer creating virtual objects and such for the user to interact with. Not only are they hyping this as a gaming gimmick, they are also promoting it as a design tool, eg. for architects and other jobs requiring 3D modeling.

The thing is, Augmented Reality is a completely useless gimmick for the vast majority of consumers. 99.9% of players will perhaps play with it for half an hour and then get bored. (In fact, I have my doubts about its usefulness in more serious applications as well, but since I'm not an expert on those, I can't comment.)

The OR (and other similar VR headsets) has great potential for playing traditional video games in a significantly more immersive way. (Although it will certainly have its problems. Motion sickness is the most prominent of them. If you get motion sickness from playing a normal first-person shooter, you'll get it fifty times worse with a VR headset. You'll probably get motion sickness even if you are an experienced first-person shooter player. It's probably something one gets used to, though.)

And that's what they should be focused on: Playing traditional video games while sitting on your couch. Not this Augmented Reality crap. AR is a useless gimmick, and the vast majority of people will not use it for anything. The vast majority of people will want to play actual games with the headset.

Why is concentrating on AR crap a problem? Because it needlessly increases development time, possibly the complexity of the hardware, and, consequently, the price of the device. You will be paying for a completely useless feature.

I appreciate that they are taking their time and want to make it right the first time, rather than publishing half-finished products and using the consumers as beta testers. However, I do not appreciate them spending time, effort and money on useless features that will only make the device more expensive for no benefit.

Poor Xbox One... update

I wrote in a previous blog post about the problems I saw with the Xbox One a bit over half a year ago. (You should read that post before this one because this is just an update on it. I'm not going to repeat the same points.)

It seems that Microsoft has really learned their lesson. There were two things about their E3 conference that people clearly noticed:
  1. Not a single mention of the Kinect. Not the device itself, not a single Kinect game.
  2. The presentation was 100% about games, rather than multimedia.
This was, in a sense, a complete 180-degree turn compared to their console pre-launch presentations, where they constantly hyped the Kinect and the multimedia capabilities of the console. That didn't sit well with the public, and Microsoft clearly learned their lesson.

There's one small problem with that, though: While Microsoft has not, technically speaking, abandoned the Kinect (they have said that they have some Kinect games in development), in practice it seems they have. Which means that all the hundreds of thousands (perhaps millions) of people who bought the console while the Kinect was a non-optional peripheral bundled with it are worse off. They paid something like $100 for a piece of hardware that will see little to no love from game developers, not even from Microsoft themselves.

Also, many have commented that without the Kinect the Xbox One is simply a slightly less powerful PS4. (Their hardware is indeed surprisingly similar; the main differences are that the Xbox One uses slower DDR3 memory, paired with a small amount of fast ESRAM, where the PS4 uses GDDR5, and its GPU is somewhat less powerful.)

Another new announcement they made was backwards compatibility with the Xbox 360. In other words, in the near future (if not right now) you will be able to play Xbox 360 games on the Xbox One. (The way it works is a bit quirky. You put the original Xbox 360 game disc into the console, which verifies its authenticity, but then it doesn't run the game from the disc. Instead, it downloads a version of the game from Microsoft that has been repackaged for the compatibility mode, and runs that. Quirky as it is, this will probably allow them to sell Xbox 360 games from their online store more easily.)

People have commented that this is way, way too late. If this feature had been there from the very beginning, it would most probably have boosted sales of the console; it would have given Xbox 360 owners a lot more incentive to buy it. Now, over a year and a half later, it's way too late. Xbox 360 games are rapidly becoming a thing of the past. While the feature may be nice for people who already own an Xbox One, it's unlikely to entice many non-owners to buy one now. Microsoft should really have gone the extra mile and made it a launch feature, rather than adding it this long after launch.

Thursday, June 18, 2015

Why Affirmative Consent Laws are a bad idea

"Affirmative Consent" law means that if you are a man and have sex with a woman, you must explicitly ask her for consent first, and unless she explicitly verbally gives consent, you are a rapist.

These laws, as written and enacted, are completely unilateral. Notice that it's always "he must ask her for consent", never the other way around. These laws never require a woman to ask consent from a man. But never mind that; it's not the main point in this post.

Assume this scenario: You are a man, and you are so drunk that you pass out. Some woman then performs oral sex on you, without you knowing, being aware, or being able to do anything about it, because you are unconscious. Later she accuses you of rape.

According to the Affirmative Consent principle, she is right: You had sex (by legal definition) with a woman, you did not ask her for consent prior to the act, and she did not explicitly give you consent. Therefore de jure you are a rapist. Never mind that it was essentially she who raped you, rather than the other way around; according to the law, you are the rapist in this scenario.

A far-fetched, artificially constructed, idiotic hypothetical, isn't it? That would never happen in reality. People are not that stupid. Surely they understand the spirit of the law rather than staring blindly at its letter?

Except that it has happened already.

Welcome to bizarro world, where a man can be raped by a woman, and it's the man who gets convicted of rape.

Sunday, June 14, 2015

Why hiring quotas are a really bad idea

The current social justice zeitgeist is moving more and more in the direction of our society being forcefully "equalized" completely blindly. In other words, if there's a disproportionate number of people of a certain demographic in, for example, company management positions, then this will be "equalized" by forcing the company to hire people of other demographic groups into those positions (regardless of whether they are qualified or not). In other words, hiring quotas will be implemented.

This is not something that's still in the future. It is already happening with some jobs in some places. For example, see: Woman Flunks Fitness Test, Gets Firefighter Job Anyway.

This is sometimes called "positive discrimination". That's quite an oxymoron if there ever was one. It's as insane as saying "positive theft" or "positive rape". Discrimination of people based on their gender, race or other such characteristics is never and will never be "positive".

There are several reasons why hiring quotas are a bad idea. Among others:

Increased incompetence

The purpose of a company is to be as efficient and proficient as possible; after all, its purpose is, to put it bluntly, to make money. This is not always a bad thing: Most of our technological progress has happened because companies have innovated and developed new science, technology and engineering. You wouldn't be using the computer you are using right now if it weren't for the hundreds of companies that have developed the technology over the decades and even centuries. And that's just one example.

To achieve this, companies need to hire the most competent people. The people with the most education, experience and/or talent in the field.

Hiring quotas undermine this. They stop companies from hiring the most competent people, instead forcing them to hire (and waste money on) less competent ones. (And of course they force companies to hire or reject people based on their gender or skin color, which is rather obnoxious in itself.)

If this becomes widespread, it will have a negative impact on our overall progress.

Companies may get punished for doing nothing wrong

Oculus VR, the company developing the Oculus Rift, has been criticized for not hiring enough women. Their response is quite clear: They have zero problem with hiring women. It's just that there aren't many female applicants at all. It's hard to hire women who are not applying for the job in the first place. How do you hire people who do not exist?

Imagine that Oculus VR resided in a place where the government imposes fines on companies that do not comply with hiring quotas. What exactly should the company do in this situation? There are no female applicants, so how should they comply with the quota? Are they just supposed to keep paying fines for non-existent applicants, or are they supposed to hire "ghost" employees who do nothing for the company, but who have to be paid the same as everybody else? What sense would this make?

Can be dangerous

As exemplified by the case cited at the beginning, of an unqualified firefighter being hired to fulfill a quota, this may in some cases even be dangerous. It can endanger lives if people's safety and well-being are put into incompetent hands. This quote from the article is rather telling:
Some FDNY members are angry. “We’re being asked to go into a fire with someone who isn’t 100 percent qualified,” the source said. “Our job is a team effort. If there’s a weak link in the chain, either civilians or our members can die.”

It increases prejudice

One goal for these "hiring quotas" is to fight prejudice. Ironically, hiring quotas increase prejudice instead.

In companies that have normal hiring practices, you can know that anybody in an important position is there because he or she is competent and has done a lot of work to get there. It can be assumed that they know what they are doing and what they are talking about.

However, if hiring quotas are enforced, people will become more prejudiced against employees who may have gotten the job due to a quota rather than due to competence. What they do and say may be deemed more dubious and taken less seriously, and soft in-company discrimination against them may even arise, because the other employees may suspect that they were hired via a quota.

It devalues the worth of competent employees

Directly related to the previous, employees who have done a lot of hard work to get into their current position may find said position devalued because of all the other employees hired to fulfill quotas. They may lose motivation when their efforts are not rewarded, and may become the unjust target of assumptions and prejudice. Why should they work so hard, when they could get the same salary and recognition by doing less, like the incompetent quota-hired employees?

Thursday, June 11, 2015

"Oreos"

The moniker "oreo" is used mainly in certain parts of the United States as a derogatory term for black people who, as some see it, behave and live in a very typical "white" fashion. (The metaphor alludes to the expression "black on the outside, white on the inside", like Oreo cookies.)

I'm not exactly sure of this, but it's my understanding that the term probably started mainly as a racist term used by some white people against black people who were atypical of "black culture" (ie. were rich, highly educated, lived in rich neighborhoods among rich white people, etc.) and was later "appropriated", if you will, by black people. In this case, however, the appropriation didn't change it to a positive term (like happened with the word "nigger"), but retained its negative connotation.

In other words, the term "oreo" is used by many black people in the United States as a derogatory term for other black people who they see as acting like white people. These are, as said, black people who do not act in a "gangsta thug" way, do not participate in that kind of culture, are often well educated, sometimes rich, may live in the better parts of the city, and so on. In other words, they live the life of an archetypal middle-class (or sometimes rich) white person. The black people who deride them often see them as "traitors" to the "black culture", and as having become submissive to the oppressors and assimilated into their culture, or whatever.

So yes, there are many black people in the United States who resent other black people who live, what can be considered, a "normal" life instead of living in the ghetto (usually below the poverty line, and often with poor education) and being part of the black "gangsta" culture.

This is a fairly well-known phenomenon, and it gets talked about to some extent. However, it's a difficult subject for progressive social justice advocates (or should I say "warriors"). The practice is definitely detrimental because it perpetuates a self-destructive lifestyle that keeps a big part of the population living in poverty, with low education, low income, poor living conditions, and little to no prospect of a better life. It also perpetuates and reinforces a very divisive and hostile "us vs. them" separationist mentality and is too often riddled with outright racism, with strong prejudice and even discrimination against a group of people based on their skin color (ie. white people in this case).

But the problem from the perspective of a social justice warrior is that these are not the "enemy" (ie. white males). They are the people they are trying to "protect" and fight for. Thus they find themselves in a very difficult and awkward situation where they recognize a pattern of behavior that's highly detrimental, yet they can't speak out against it very loudly, because they are uncomfortable criticizing and attacking the very people they are supposedly trying to "protect". Therefore even if a few such people do comment critically about the phenomenon, it's not a very hotly debated topic, and it's mostly ignored.

This isn't something exclusive to the United States either. Among other western countries, Sweden has become another bastion of this very mentality. More and more immigrants (even those who aren't exactly "black") are appropriating this same "black gangsta thug" culture, along with the exact same kind of derisive attitude towards other immigrants whom they see as "traitors" to their "culture" (ie. those immigrants who have simply integrated into Swedish society and are living a completely normal, productive life, like any other Swede).

Of course Sweden being what it is, this is a very taboo subject there. You can only hear it from immigrants who have become the target of such derogatory attitudes, and who dare to speak out.

Needless to say, the Swedish immigrant population is destroying itself from the inside by this kind of separationist attitude. And the Swedish government and media aren't exactly helping.

Friday, June 5, 2015

The newest marketing scam: The batteriser

The batteriser is a small sleeve that can be attached to an AA battery, and which taps into the extra charge left over after the voltage has dropped below the cutoff threshold of whatever device you were using the battery in, thus prolonging its life. It does this by boosting the voltage of the battery back to 1.5V (using a step-up voltage converter). The site claims that this can extend battery life by up to 8 times.

To understand the claim, consider this typical battery discharge curve:

[Figure: a typical AA battery discharge curve, plotting terminal voltage against the capacity delivered.]

As the battery is drained, its voltage decreases. Battery-powered devices have a minimum voltage that they require to operate, called the cut-off voltage. When the battery's voltage drops below that point, the device stops working.

The website claims that the typical cutoff voltage for most everyday devices is between 1.35 and 1.4 volts per battery. If you look eg. at the blue curve above, you'll notice that after the battery's voltage has dropped below that level, there's still quite a large amount of charge left in the battery, if only it could be used at the required voltage.

This all sounds fine and dandy. The problem? The 1.35-1.4 volt cut-off claim is a lie.

Most everyday battery-powered consumer electronics, including things like remote controls, game controllers, wireless mice and keyboards, have a cut-off point between 1.0 and 1.1 volts per battery. This has been repeatedly tested by aficionados and professionals. (And, in fact, these devices are explicitly designed with that cut-off voltage precisely because it maximizes the useful life of a typical AA battery, most of which have a discharge curve similar to the one above.) Even the most power-hungry (or low-quality) devices have a cut-off point of at most 1.2 volts.

If you look at the curve again and see how long a device with a cut-off point of eg. 1 volt will keep working, you'll see that it will have used well over 90% of the battery's charge before dying. A 1.1 volt cut-off point extracts somewhat less, but not significantly less (perhaps in the 80-90% range).
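
To put rough numbers on this, here is a small back-of-the-envelope sketch in Python. The discharge-curve points are purely illustrative assumptions that follow the general shape of a typical alkaline AA curve (they are not measurements of any particular battery); the script simply interpolates how much of the capacity has been delivered by the time the terminal voltage falls to a given cut-off.

    # Rough sketch: estimate the fraction of an AA battery's capacity a device
    # can use before hitting its cut-off voltage. Curve points are illustrative
    # assumptions, not measured data.

    # (fraction of capacity delivered, terminal voltage), voltage decreasing
    DISCHARGE_CURVE = [
        (0.00, 1.55), (0.10, 1.45), (0.30, 1.30), (0.60, 1.20),
        (0.85, 1.10), (0.93, 1.00), (0.98, 0.90), (1.00, 0.80),
    ]

    def usable_fraction(cutoff_voltage):
        """Interpolate the capacity fraction delivered when the terminal
        voltage first drops to cutoff_voltage."""
        for (f0, v0), (f1, v1) in zip(DISCHARGE_CURVE, DISCHARGE_CURVE[1:]):
            if v1 <= cutoff_voltage <= v0:
                return f0 + (f1 - f0) * (v0 - cutoff_voltage) / (v0 - v1)
        return 1.0 if cutoff_voltage < DISCHARGE_CURVE[-1][1] else 0.0

    for cutoff in (1.40, 1.35, 1.10, 1.00):
        print(f"cut-off {cutoff:.2f} V -> ~{usable_fraction(cutoff):.0%} of capacity used")

With these assumed numbers, the 1.35-1.4 volt cut-offs that the marketing assumes would indeed leave most of the capacity unused, while the realistic 1.0-1.1 volt cut-offs already use the vast majority of it, leaving the "batteriser" very little to recover.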

So, the "batteriser" may do what it claims to do (ie. increase the voltage of the battery to 1.5V regardless of its charge), but it won't actually extend its life in any significant way. The fact is that the 1.35-1.4 volts figure has been deliberately chosen by the marketing of the device in a deceptive and dishonest way. The figure is incorrect for the vast majority of battery-powered consumer products. (It might be true for some very rare, probably very low-quality products, but it's not true for normal devices.)

There may also be drawbacks to using the device: Granted, the website recommends attaching it only after the battery has been drained to the point that the device no longer works. However, if you were to use it from the get-go, it would actually shorten the life of the battery, not extend it. That's because the converter itself consumes some of the energy (no real electrical device is 100% efficient). And if the battery was already going to be 80-90% used (or even more), adding this extra baggage is only going to drain it faster.
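
As a minimal arithmetic sketch of why (again with assumed figures, not measured specs): suppose the boost converter is 85% efficient and the device would already use about 90% of the battery's capacity on its own with a 1.0-1.1 volt cut-off. Even under those fairly generous assumptions, running everything through the converter from the start gives slightly less runtime, not more.

    # Assumed figures, not measured specs.
    converter_efficiency = 0.85      # plausible efficiency for a tiny boost converter (assumption)
    usable_without_converter = 0.90  # capacity a 1.0-1.1 V cut-off device already uses (assumption)

    # Runtime with the converter is roughly proportional to (full capacity * efficiency);
    # runtime without it is proportional to the fraction the device uses anyway.
    relative_runtime = (1.0 * converter_efficiency) / usable_without_converter
    print(f"Runtime with converter vs. without: {relative_runtime:.2f}x")  # ~0.94x, i.e. shorter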

More damningly, however, the "batteriser" may actually short-circuit the battery due to its design. The entire metal jacket of an alkaline battery is its positive terminal (and the bottom cap is its negative terminal, insulated from the positive one by a thin insulating ring). The "batteriser" sleeve is metallic, and if any part of it makes contact with the jacket of the battery, it might short-circuit it. Short-circuiting a battery can be dangerous (for many reasons, all the way up to causing a fire).

Who is ultimately to blame for a bad movie?

Answer: The director. (With one caveat, explained at the end of this post.)

When you are watching a movie and it turns out to be absolutely horrible, there may be many reasons why. Perhaps the actors are just untalented or doing a piss-poor job at it, like their heart is not in it. Perhaps the script is just bad, full of ridiculously poor dialog, naive plot twists, plot holes and illogical events. Or perhaps the story is just outright boring, or badly written. Perhaps the props, costumes and visual effects are so bad that they ruin the movie (and aren't even of the "so bad it's good" kind).

However, there is one person who always carries the ultimate blame for a bad movie: The director.

The movie director is the person responsible for making the movie good. He or she is the person ultimately responsible for catching mistakes and fixing them; for noticing bad actors and coaching (or replacing) them; for making sure that props and effects are good even if the budget of the movie is small. (A small budget is not an impediment to making a great movie. Some of the best and most highly praised movies in existence have had ridiculously small budgets, even in the mere thousands of dollars. It of course requires talent to pull off, but a good, talented director can demonstrably do it.)

In a sense, the director is the "last line of defense" against all things that could make the movie bad. The director must notice when something is wrong, and fix it, rather than letting it slide. You can blame eg. the scriptwriter for writing a bad script, or an actor for being completely untalented or doing a poor run-of-the-mill job, but ultimately it is the director who should have caught the problem and fixed it. The director should have demanded a rewrite of the script, or instructed or even replaced the actor.

I suppose my point is that the next time you see a bad movie, be aware of whom you should point your finger at first. Look at who the director was.

And that caveat: Sometimes even a great director will release a botched movie, but unwillingly. This happens when meddling executives who want their fingerprints on the movie go over the director's head and demand changes that the director doesn't approve of. In some cases this has been so bad that the director has completely disowned the movie (one of the most common ways of doing that has been to use the pseudonym Alan Smithee, which is a codeword for "I disown this movie, I don't want my name attached to it", although this practice has apparently been discontinued for some reason).

Is this the last console generation?

Some people have presented the wild hypothesis that the current console generation (ie. the Xbox One and PS4) is likely to be the last game console generation, at least in its "traditional" form. (Most of these people agree that the Nintendo consoles will go on for at least one generation because Nintendo's position and marketing strategies are quite different from those of Sony and Microsoft.)

Several reasons for this have been presented:

Firstly, sales of "traditional" (ie. desktop) consoles are showing signs of waning. The major reason for this is that (by some estimates) about 70% of video game consumers are casual gamers, and there nowadays exists a set of platforms that are something of a console killer from a casual player's point of view: Smartphones and tablets. There is less incentive for casual gamers to buy the latest desktop console given that they can play casual games on their phones. This may mean that in the very near future (if not right now) sales of desktop consoles will plummet to a fraction of what they have been in the past.

More "hardcore" gamers of course are avid console (and PC) users, and most of them don't find smartphones nor touchpads any kind of enjoyable gaming platform. The problem is that these "hardcore" gamers constitute something like 20% of all video game consumers, which is a pretty small fraction. If the trend of casual gamers moving to smartphones continues, the sales of traditional consoles may drop below what's profitable for console makers.

(This also explains why these people estimate that Nintendo will be fine, because their consoles are still in high demand by more casual gamers. They are pretty much the casual gamer brand. The Wii U was somewhat of a mistake by Nintendo, but such mistakes haven't killed Nintendo in the past either, and it's very likely that they will learn from it with their next console, and start appealing to the casual gamer masses again. Although the next Nintendo console might not be a traditional desktop console, but some kind of Wii+3DS hybrid.)

Secondly, recent consoles have sold not only because of their gaming capabilities, but also because of their multimedia capabilities. For example the PS2, the best-selling console in history, was bought by quite a few people almost exclusively as a cheap DVD player. Likewise, many people bought the PS3 as a cheap Blu-ray player.

However, this is changing rapidly. Multimedia on physical discs is fading fast, as fast internet connections become the norm and most multimedia services go online. (Some video rental services have even stopped renting physical discs entirely and gone completely online.)

In other words, there is less and less motivation to buy a console to watch multimedia, as everything is being put online and is watchable on any tablet, PC, dedicated streaming box, or whatever machine. (I wouldn't be surprised if TVs start supporting online rentals directly, without the need for a separate device.) Consoles are becoming less and less multimedia machines, and more purely gaming machines. (This can be seen in the backlash against Microsoft trying to sell the Xbox One as a multimedia machine more than a gaming console. It just didn't work. People aren't interested in it for that reason.) And given that gamers who are interested in consoles form a small minority of the marketplace, this means that there will be less and less demand for desktop consoles.

Thirdly, Sony isn't doing all that great as a corporation. They are having financial problems. "Sony" is not the famous high-quality brand that it used to be decades ago, and their revenue is shrinking year by year. If this trend continues, they are likely to merge with someone else (perhaps even Microsoft), if not go bankrupt (although that's unlikely; a merger before bankruptcy is enormously more likely). I do not know the actual figures, but I have the impression that Sony's PS4 division is actually their most profitable one (with the possible exception of Sony Pictures), and given that the demand for consoles might be decreasing, as explained above...

Fourthly, while Microsoft is doing great (with revenue something like an order of magnitude larger than Sony's), there are signs that they are slowly losing interest in the game console market. In some sense their Xbox One project felt a bit half-assed, something they did more out of obligation than because it's actually a highly profitable business. (It is actually quite well known that Microsoft doesn't make much money from console sales themselves. They might even lose money overall. Their profits come from game sales, Xbox Live subscription fees, etc.)

Microsoft has had this tendency in the past to try a new market, suck it dry, and then leave. It wouldn't be surprising if game consoles would eventually become another example. It is not in any way inconceivable that they will just stop making consoles after the Xbox One. (At most they might make some kind of multimedia/streaming machine which might use the "Xbox" brand name, but will be so in name only, and which might be able to play PC games, but that's it.)

Some have also speculated that a subtle change in narrative from Microsoft might be another sign that they will be phasing out their desktop console line. In the past, Microsoft's E3 and other gaming presentations hyped their console-exclusive games, and either didn't mention at all, or mentioned only in passing, that "yeah, this game will also have a PC port at some point". The fact that many Xbox games would also be available for the PC was treated like an embarrassing little open secret which they didn't like to advertise all that much. This changed quite radically at their 2015 E3 conference, where they were completely open and unashamed about their big games also being ported to the PC. It's like they aren't bothered by that at all anymore, which might be another sign that this will be their last desktop console and that they will be moving away from it in the future. (Although, admittedly, this is 100% speculation.)

This is not, in fact, far-fetched. It may be easy to think "that's just crazy; video game consoles will never die". But consider that at least 90% of game console brands have died in the past, some of which were even highly popular (such as the Sega console line). It's not inconceivable that the same will happen to one or two (or even all three) of the current console lines. If there is not enough demand, they will die, like it or not.

Note that this hypothesis is not envisioning the death of video games. They will most probably do just fine. It's envisioning the death of "traditional" desktop consoles. In a few decades such consoles might become a thing of the past in the same way as many other consumer electronics (like VCRs, CRT TVs, cassette and CD players, etc.) If the next console generation never comes, game developers might keep making games for the current ones for a decade or two, but it will eventually stop (and instead they will move to PC and whatever new gaming technology may be developed in the future).

So, in summary, the reasons why this might be the last generation of "traditional" desktop consoles are:
  • About 70% of video game consumers are casual gamers, and smartphones and tablets are console killers from their point of view. Only about 20% of consumers are actual console gamers. This may not be enough.
  • Consoles used to be an affordable alternative to multimedia disc players, but the popularization of high-speed internet connections and online services are killing that market.
  • Sony isn't doing all that great overall as a company. In conjunction with the above reasons, it may kill the PlayStation line.
  • Microsoft is showing subtle signs of losing interest in their desktop console line. Their promotions feel a bit lackluster and forced, and their narrative has recently changed. (They used to be quite hush-hush about the fact that many of their console games were also available on the PC. They are now quite open and unashamed about it. This might mean they have a long-term plan of moving back to Windows exclusively, and abandoning the Xbox.)
  • Nintendo will most probably make a 9th generation console, but it may not be a desktop console (instead being some kind of Wii+3DS hybrid.)

Thursday, June 4, 2015

Anti-scientism and pseudointellectualism

Many commenters have noticed and talked about, especially during the past decade or so (and at an increasing rate), how anti-scientism is on the rise in many parts of the western world. This takes the form of modern conspiracy theories and denialism (such as climate change denialism, anti-vaccination, anti-evolutionism, and so on) gaining more and more widespread acceptance among the general public. In general, science is for some reason increasingly distrusted, and its claims are attributed to a conspiracy. Many such commenters have been saying that we are, essentially, going back to the dark middle ages.

Nowadays, almost invariably, if someone writes eg. a newspaper article online on this subject, the comment section will be filled with people supporting the denialism and the conspiracy theories, often with a very smug and "knowledgeable" tone.

These anti-science attitudes are often described as superstition. I, however, would describe them as pseudointellectualism. It's precisely that smug tone, the kind that implies "you clearly don't know what you are talking about, you have been duped, clearly you haven't studied these things like I have", emanating from all those pro-denialism comments, that makes me think so.

Conspiracy theories and denialist movements make people feel smart, knowledgeable and intellectual. They give people the sense of being above deception, that they can't be deceived or misled, that they are too smart for that. They like the feeling of being smarter than the scientists, of seeing through their "deception", of being above them rather than subject to them. Many of them also love being able to use fancy terms and refer to higher notions, as if they knew what they were talking about. In other words, they love feeling intellectually superior to the masses, who they think are being duped into believing falsities.

This is pseudointellectualism. The self-notion of intellectual superiority, based on completely false and fallacious claims and notions, and without any real accurate knowledge, experience or relevant background in the subjects being discussed.

They also project onto themselves the work done by others (ie. the conspiracy theorists and denialists). Whenever such a person says something like "I have studied this subject", or "I have done the research", or "I have read all the relevant papers", you can be pretty much certain that they have not. Instead, what they have done is seen or read someone else's claims, thought "this guy has clearly done a lot of work and research on this subject and knows what he's talking about", accepted those claims, and then projected that perceived "study" and "research" onto themselves, as if they had done it personally, when in fact they haven't.

Quite ironically, all these people who feel intellectually superior to the masses and above the possibility of being deceived are in fact being deceived, and they are falling for it hook, line and sinker.

Conspiracy theorists and denialists are, essentially, so-called spin doctors. In other words, they are experts in spinning the details of events and claims to make it appear to the inexperienced as if the exact opposite is true, or that the claims are much more dubious than they really are. They are like magicians, experts at misdirection and making you see things or not see things as they will. They will cherry-pick details and present them out of the larger context in a manner that makes it look like they support a completely different narrative, usually hiding the actual explanation (or, alternatively, "poisoning the well" against it).

These "spin doctors" are seldom able to fool the experts, because the experts obviously know the subject a lot better than that, and have a lot of experience and knowledge about it. However, the average person is not an expert, because that requires years of hard work and study, and that's why they are much more easily fooled. And since believing in these conspiracy theories makes them feel intellectually superior to the masses, makes them feel like they are "in the loop" and above deception, they are often very willing to believe without question.

All the while they are the naive and deceived ones, which is the big irony here. It's age-old con-artistry: Make the mark believe they are in control and making all the decisions, like they are above everything else, when in fact they are just being manipulated by the con artist.

Giving certain video games a chance

I have written in some previous blog posts how I hate leaving video games unfinished, and how doing so is a relatively rare occurrence. Sometimes I wade through a game to the end even if it feels more like a chore than a joy. (While it's a very fair question how much sense that makes, I still like to do it, rather than leave it unfinished. Having finished a game, rather than leaving it permanently in an unfinished state, is a kind of mental reward in itself.)

In a few cases this principle actually pays off. Some games may feel quite boring and tiresome at first, during the first hours of gameplay, and one easily gets the urge to stop playing them. However, when you keep playing, you might eventually find out that it's actually quite a good and enjoyable game.

Most recently I experienced this with Dragon Age: Inquisition. The first 5-10 hours or so of gameplay felt like a chore, with my interest in the game lessening by the hour. Maybe it was the lack of interesting quests, a seeming lack of focus, and it not being clear where to go next, how the storyline is progressed, or how the war room map works... The rather random difficulty spikes in enemy strength, and the somewhat unclear combat system, didn't exactly help either. I was, in fact, considering stopping playing and leaving the game unfinished. It was that bad.

But I pressed on, out of stubbornness, and in this particular case it actually paid off. Once I got accustomed to the combat system (and to the controls, as each console game seems to control just slightly differently), to how the map and quests work, and to how the storyline progresses, the game actually became enjoyable again. Perhaps even a bit addictive.

In a sense, I gave the game a chance, and it worked. It's actually a pretty decent and enjoyable game after all, when you get used to it and learn the basics, ie. after you get over that initial learning curve.

This isn't the first time either. I have had this very experience with other games as well, sometimes even games I did leave unfinished, but gave a second chance years later.

This made me wonder: How often do gamers "give up" too soon, and leave games unfinished, even though if they just gave them a chance they might find them enjoyable after all?

I have an acquaintance who I think is a bit like that. In other words, he gives up on video games relatively easily. Several times I have witnessed how he has re-sold a game very soon after buying it (too soon to have been able to play it through). Games that I thought were pretty good.

Of course the dark side of this is that some games don't get better even if you press on. It's hard to predict.

Tuesday, June 2, 2015

Why you shouldn't enroll in gender studies

Many western universities have so-called "gender studies" departments and curricula. Here are some reasons why you shouldn't enroll in those courses:

It's a complete waste of time and money

Unless you happen to live in one of the very few countries where university is free or very cheap, studying at a university is extremely expensive. We are easily talking about tens of thousands of dollars (or your regional equivalent).

University is also a unique and extremely important part of your life, if you get to enroll in one. It can be the most important period of your life, where you learn an actual profession, which you will probably practice for the rest of your life. Universities are usually the places with the highest educational standards and content, which allow you to reach the most prestigious and most useful careers.

Wasting this unique opportunity, and that enormous amount of money, is one of the stupidest things you could ever do in your life. You should take this opportunity to study something useful, such as one of the STEM fields (Science, Technology, Engineering, Mathematics). These are the fields that have actual career prospects. Gender studies does not. It is completely useless in this regard and has very poor career prospects. You are most likely to end up in a completely unrelated job, and all your years of study and all that money will have gone completely to waste.

(Ironically, gender studies majors often complain how there are so few women in STEM fields. While they themselves chose gender studies over those fields. Think about that for a moment.)

It makes you into a bigoted, angry zealot

This is not something I am saying. This is something that feminists who have enrolled in gender studies at University are saying.

Many of these students were very rational, reasonable and normal before enrolling, and became angry, bigoted and incredibly paranoid afterwards. Gender studies courses are essentially indoctrination and brainwashing. There are quite a few testimonies from feminists who say things like "I didn't even know how much sexism, misogyny and racism there is all around me before these courses", "I started seeing sexism, oppression and racism everywhere", and "every time I went to class, I became angrier and angrier".

And mind you, those are not confessions of ex-feminists who "deconverted", saw the error of their ways, and are telling their stories as a warning to others. No, those are testimonies of feminists who still think like that and are saying those things as if they were a good thing. That's the level of indoctrination we are talking about here.

It makes you paranoid, antisocial, and unhappy

While closely related to the previous, it deserves its own attention.

As mentioned, gender studies classes tend to make students paranoid. People who were completely normal, had normal attitudes and views of society, and were very sociable and nice become very paranoid and angry. They start seeing demons everywhere, in the form of rampant sexism, oppression, misogyny, racism, and so on. Even minor and completely inconsequential things that are not related in any way suddenly become symptoms of the deeply rooted oppression and sexism in our society. You are also very likely to become a racist misandrist who hates white men (even if you are a white man yourself!). You may try to convince yourself that you won't, but you will.

Recently "converted" feminist social justice warriors are extremely antisocial. They can't stop talking about their paranoia, and they are extremely irritating to be around. They will anger and drive away people close to them. They will most likely end up hanging with only likeminded social justice warriors, which ends up forming a cult-like environment.

Social justice also makes people unhappy, even about hobbies that normal people enjoy. For example, numerous feminists have written articles about how they feel guilty, dirty and like bad people because they realized that they actually enjoyed eg. a video game which, on second thought, they then found "offensive", or "sexist", or "oppressive", or whatever the buzzword. Their feminism kills their own enjoyment, and makes them feel guilty and unhappy.

There is no joy in the life of a modern feminist social justice warrior. There is only anger and unhappiness. Like in that one South Park episode, in their eyes everything becomes excrement, and they can't enjoy anything.

It causes fundamental human rights problems

I have recently been writing blog posts about how universities are becoming the antithesis of academic freedom and free speech (such as here and here). The social justice warriors in many universities are aggressive, and they trample all over the most fundamental principles of freedom, sometimes even breaking the law in the process. We have all kinds of astonishing extremism, all the way from people explicitly advocating the end of academic freedom, to people banning white men from attending events or running for leadership positions, to harassment of falsely accused people and amazingly racist comments. We even have feminists attacking other feminists (read the whole story, it's quite telling).

We are reaching a point where being a white man at a university is not safe anymore. You are treated like a second-class citizen, like a sub-human. You have fewer rights than everybody else, and if somebody eg. accuses you of being a rapist, your life is pretty much over, no matter how innocent you are. You will be harassed, you will be put on blacklists, and even your health and safety may be at risk. Even if you aren't accused of anything, we are moving more and more in a direction where you will have fewer rights than other people, and you will be the subject of insults and defamation, for the sole reason that you happen to have the wrong skin color and gender, and you'll simply have to endure it.

None of this nonsense comes from the STEM departments or their students, or from some outside party. No, all of it comes directly from the gender studies departments and their students. They are directly responsible for all of this. It's these departments that are causing the end of academic freedom, and that are producing a generation of sexist, racist bigots.

If you are a good person, please do not enroll in gender studies, or any similar courses. Please remain a good person, for your own sake, and the sake of everybody else. Do not waste your life. Enroll in a STEM field; you'll be much happier and much more productive, and you will be actually helping our society, rather than destroying it.