Thursday, July 27, 2017

Postmodernism

In order to understand Postmodernism, we first have to understand what Modernism is.

Modernism is a kind of umbrella term used to describe the radical changes in sociopolitical, philosophical and scientific thinking in the western world, which started in the 18th century and eventually spread to much of the rest of the world.

The philosophical and scientific aspect of this zeitgeist started with the so-called Age of Enlightenment in the 17th and 18th centuries. Prior to this time, religion, philosophy and science were all considered pretty much parts of the same thing, tightly tied to each other. The approach to science was highly presuppositionalist (heavily driven by the religious and philosophical ideas of the time, with any scientific theories being controlled by these presuppositions).

The Age of Enlightenment, however, caused a massive change in the philosophy of science. It was the time when the modern scientific method was developed. Religious and philosophical presuppositionalism was discarded and replaced with evidence-based, fact-based methodology. (In other words, the direction of scientific research was effectively reversed: rather than assuming a conclusion and then trying to corroborate it with research and testing, the research and testing are done first, and the conclusion follows from them, without prejudice or presupposition.) This philosophy of science gave rise to modern scientific skepticism: no claim is accepted without sufficient valid evidence, and all claims must be based on facts and on measurable, repeatable, falsifiable evidence. This was in drastic contrast with prior centuries, where religious presuppositionalism was rampant and scientific rigor was almost non-existent.

This caused an inevitable and pretty much total separation between science and religion, and effectively a split within philosophy as well. Where previously religion, philosophy and science were considered just aspects of the same thing, now they were completely separate.

On the sociopolitical side, Modernism refers to the ever-increasing animosity within the general public against royalty, nobility, and overall any form of governance that was inherited, elitist, aristocratic, and oligarchic. The culmination of this unrest was the French Revolution at the end of the 18th century, where the rule of the absolute monarchy in France was replaced by a republican government.

This revolution was extremely significant in world history because it triggered the decline of absolute monarchies in the west, and eventually in pretty much the entire world, replacing them with democratic governments, where the government is elected by the people, from among the people, rather than being owned by an oligarchy by birthright. While royalty and nobility still exist in many countries even today, they (especially the latter) do not hold any significant power, and are mostly nominal and customary.

On a perhaps more abstract level, some of the fundamental characteristics of Modernism are objectivity, (scientific) skepticism, equality, and meritocracy. The world and human society are judged by facts and hard, testable evidence; morality and legislation are made as objective as possible; people are treated as equally as possible; and societal success ought to be based on personal work and merit rather than birthright and class. This is the era of science, technology, universal human rights, democracy, and the modern judicial system.

Postmodernism, on the other hand, is a much fuzzier and harder to define concept. And, in my opinion, almost completely insane.

One of the driving ideas behind postmodernism is the concept of subjective truth, and to a degree, a rejection of objectivity and science. Postmodernism is often summarized as "all truth is relative".

The notion of subjective reality in postmodernism can range from the absolutely insane (and therefore innocuous, as it has no danger of affecting society) to the more down-to-earth mundane things (which, conversely, can be a lot more dangerous in this sense).

The most extreme (and therefore most innocuous) form of this is the notion held by some people that the universe, the actual reality we live in, is personal and subjective. We make our own reality. If we think hard enough about something happening, it has a higher chance of happening, because we shape our own reality, our own existence, with our minds. Evidence-based science is rejected as antiquated and closed-minded.

That kind of philosophy is not very scary because nobody takes it seriously. Especially not anybody with any sort of power to impose it onto others and, for example, create legislation based on it.

However, there are other forms of postmodernist notions that are much more dangerous and virulent. Perhaps no other example is better and more prominent than the postmodernist idea of human gender.

In modernism, gender is defined by what can be measured and tested. It's a cold, hard scientific fact. We can take samples and inspect them with machines, and see what the samples consist of. We can observe and measure the human body, its biology and its functionality. Everything can be rigorously observed, measured and tested.

In postmodernism, however, gender is defined by the subjective feelings of the person. Not only is the gender of the person not measurable by any machine, moreover there aren't even two genders, but however many each person wants there to be. People can freely make up new genders for themselves as they see fit and feel like. It doesn't matter what the test tubes and machines say, all that matters is what the person says and feels.

Under this view, the very concept of "gender" cannot be stated scientifically, based on measurements, facts and testable evidence. Science is completely rejected here (unless it is twisted for political purposes to support the notion).

Unlike the first extreme example of postmodernism earlier, this one is much scarier because it has much more influence in the actual world. It has much more influence, on a much, much wider scale, on how people behave and, perhaps most importantly, on what kind of legislation is enacted and how eg. schoolchildren are taught and treated. Nobody in their right mind would demand that schoolchildren be taught that the universe itself is shaped by whatever we think and want. However, schoolchildren in many places are already being taught that they can create their own genders as they wish, and to completely disregard science on this matter.

The really scary thing about it is how virulent the idea is. School after school, university after university, and government after government is embracing this form of postmodernism, and some countries are already enacting laws to enforce it. And there seems to be nothing to stop the insanity from spreading.

As mentioned, the concept of gender is but one egregious example of postmodernism. There are many others. And they are getting an ever stronger hold on our society, undermining factual, objective science.

Friday, July 21, 2017

Nintendo's biggest mistake: The PlayStation

Surprisingly few people know the history of the Sony PlayStation line of consoles. This might not be news to tech-savvy console aficionados, but it's nevertheless quite little known.

The fact is, Nintendo effectively created the Sony PlayStation.

Or, more precisely, caused it to be created.

You see, back in the early 1990's, the SNES's greatest competitor, the Sega Genesis, had a CD peripheral (whose games could fill an entire CD, including CD-quality sound and some primitive video footage.)

So in order to compete with it, Nintendo wanted to also create a CD peripheral for the SNES. So Nintendo partnered with Sony to create such a thing. The tentative name for this peripheral was, and I kid you not, PlayStation.

However, the corporations got into some kind of argument, and they dissolved the partnership. But rather than just forget about it, Sony decided to create their own console. The Sony PlayStation. Which was, unsurprisingly, CD based. (How they got to keep the name, I have no idea. Maybe some kind of deal between the corporations.)

The rest is, as they say, history. The PlayStation came to be one of the most successful consoles of the era, and its successor, the PlayStation 2, became the best-selling console in history, so far. The PS3 and PS4 are not far behind either.

I wonder if Nintendo is kicking themselves because of this.

(Although, in retrospect, this might have been a blessing for gamers. Nintendo consoles are not exactly known as the platforms for badass games for hard-core gamers. They used to be, back in the SNES era, but not in a long time.)

Sunday, July 16, 2017

What is falsifiability in science?

Many people think that science works by formulating a hypothesis (based on observation and measurement) about a particular natural phenomenon, and then trying to prove that hypothesis correct. While that might sound very reasonable at first glance, it's actually a naive and even incorrect approach: it can lead to the wrong conclusions because of confirmation bias.

Rather than trying to prove the hypothesis, the better method is, as contradictory as it might sound at first, to try to disprove it. In other words, don't construct tests that simply confirm the hypothesis; instead, construct tests that, if successful, will disprove the hypothesis, show that it's wrong.

And "trying to disprove the hypothesis" is not always as straightforward as "if the test fails, it disproves the hypothesis". In many cases the hypothesis must be falsifiable even if the test succeeds.

An example of this is controlled testing. It might not be immediately apparent, but the "controls" in a "controlled test" are, in fact, there to try to disprove the hypothesis being tested, even if the actual test turns out to be positive (ie. apparently proving the hypothesis correct).

A "control" in a test is an element or scenario for which the test is not being applied, to see that there isn't something else affecting the situation. For example, if what's being tested is the efficacy of a medication, the "control group" is a group of test subject for which something inert is being given instead of the medication. (In this particular scenario this tests, among other things, that the placebo effect plays no significant role.)

If the medication were tested without a control group, a positive result (ie. the medication apparently remedies the ailment) would be unreliable. It might look like it's supporting the hypothesis, but it doesn't take into account that there might be an external factor, something else (eg. the placebo effect), that caused the positive result instead of the medication.
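To make the point concrete, here is a minimal simulation sketch (in Python, with made-up numbers purely for illustration) of a drug that does nothing beyond the placebo effect. Looking only at the treated group seems to "confirm" the hypothesis; only the comparison with the control group can disprove it:

```python
import random

random.seed(42)

# Made-up numbers, purely for illustration: assume the placebo effect alone
# makes about 30% of patients report improvement, and the drug itself adds
# nothing on top of that (ie. the drug is useless in this toy model).
PLACEBO_IMPROVEMENT_RATE = 0.30
DRUG_EXTRA_EFFECT = 0.00

def improvements(n, gets_drug):
    """Return how many of n simulated patients report improvement."""
    p = PLACEBO_IMPROVEMENT_RATE + (DRUG_EXTRA_EFFECT if gets_drug else 0.0)
    return sum(random.random() < p for _ in range(n))

n = 1000
treated = improvements(n, gets_drug=True)
control = improvements(n, gets_drug=False)

# Uncontrolled reading: "about 30% of treated patients improved, therefore
# the medication works." Nothing in this number alone can disprove that.
print(f"Treated group improved: {treated}/{n}")

# Controlled reading: the control group is the part of the experiment that
# can disprove the hypothesis. If the two rates are about the same, the
# "medication works" hypothesis fails the test.
print(f"Control group improved: {control}/{n}")
print(f"Difference: {treated - control}")
```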

It's very important that a hypothesis can be proven wrong in the first place: it must be possible to construct a test that, if positive, actually disproves the hypothesis (or, at the very least, a test that disproves it if negative.)

That is the principle of falsifiability. The worst kind of hypothesis is one that can't be proven wrong, ie. when there is no test that would show it to be incorrect.

For example, if somebody believes in ghosts and spirits, ask them if there is any test or experiment that could be constructed that would prove that they don't actually exist. I doubt they could come up with anything. The same is true for psychics, mediums and the myriads of other such things. They will never come up with a test or experiment that, if positive, they would accept as definitive proof that those things are not real. (Any results of any experiments on these subjects will be dismissed with hand-waving, like the psychic not feeling well, or whatever.)

The hypothesis that ghosts exist is pretty much unfalsifiable. While people can come up with experiments that, if positive, would "prove" their existence, not many can come up with an experiment that would disprove it. And that's a big problem. The "positive" experiment results are not reliable because, like with uncontrolled medical tests, they don't account for other reasons for the observed results.

That's why it's more important to be able to prove a hypothesis wrong than right. If numerous attempts at disproof (ie. experiments that, if successful, would prove the hypothesis wrong) all fail, that gives credibility to the hypothesis. But if no such experiments are possible, then the hypothesis becomes pretty much useless.

Tuesday, July 11, 2017

Bill Nye is a liar

Bill Nye is a somewhat famous "science communicator". Meaning that while not a professional scientist per se, he helps popularize and inform the public about scientific matters. He is most famous for his 1990's TV series "Bill Nye the Science Guy".

For some reason in later years he has become quite badly "blue-pilled" (ie. an advocate of modern feminist social justice ideology). In the absolutely infamous 9th episode of his newest show, "Bill Nye Saves the World", he advocates for "gender fluidity", and how there are billions of genders and sexes and whatnot. The episode is an absolute cringefest (and I'm not just saying that; it really is. You have to see it for yourself.)

Many people have criticized it for, among other things, dishonesty. For example at one point in the episode he says:
"These are human chromosomes. They contain all the genes you need to make a person. This one is called an X chromosome, and that one down there, that's a Y chromosome. They are sex chromosomes. Females usually have two X's and males generally have an X and a Y. But it turns out about one in 400 pregnancies has a different amount of sex chromosomes. Some people only have one sex chromosome. Some people have three, four or even five sex chromosomes. For me that sounds like a lot. But using science we know that sex and every aspect of human sexuality is.. well, it's a little complicated."

Bill Nye is implying here that the difference between the sexes is somehow fuzzy, and that there may be multitudes of different sexes. What he is doing here is lying by omission.

He is making it sound like having an unusual number of sex chromosomes is somehow normal, and that there's absolutely nothing special or wrong (biologically speaking) about it, other than the other combinations of chromosomes being a bit less common. While he doesn't outright say it, he seems to be implying that some of the everyday people you encounter out there, normal people just like anybody else, may have a different number of sex chromosomes, and that you couldn't tell the difference (other than, I suppose, that they might be more effeminate or more masculine than expected, or be in some other way of ambiguous gender.)

What he quite conspicuously does not tell you is that having a number of sex chromosomes different from the normal is actually a congenital defect, a birth defect, often with health and/or developmental consequences.

While some people with an unusual number of sex chromosomes may well turn out to be completely normal and healthy, and never even realize there's something unusual about them, the most common effects are infertility, stunted mental development (such as learning disabilities), stunted growth and lower life expectancy. And those are just the mildest ones. Severe developmental deficiencies and a significantly heightened risk of all kinds of diseases (such as cardiovascular diseases) are also common. The list of possible symptoms is really extensive. And the more the number of sex chromosomes deviates from the normal, the more common and severe the symptoms are, and the less likely it is for the person to even survive to adulthood.

In fact, the vast majority of pregnancies with sex chromosome disorders end in miscarriage or stillbirth.

But Bill Nye doesn't convey any of this to the viewer. Instead, he gives the impression that people with an abnormal number of sex chromosomes are just normal healthy everyday people that you meet every day, and wouldn't even recognize outwardly as being such.

Bill Nye's hypocrisy is also heavily criticized because in his original "Bill Nye the Science Guy" TV series there was an episode dealing with genders and sex chromosomes, which stated, clearly and repeatedly, that there are only two sexes, period. This segment was completely cut out from the Netflix re-release of the series.

Monday, July 10, 2017

New Nintendo 2DS XL and Nintendo's marketing strategy

When Nintendo released their previous-gen console, the Wii U, they botched their marketing strategy almost catastrophically. The Wii U was, indeed, a completely new "next-gen" console in the Nintendo line, in the same "console generation" (the 8th) as the PS4 and the Xbox One. It was not just a slightly fancier version of their previous-generation console, the Wii (which competed with the PS3 and the Xbox 360).

Nintendo botched the marketing because they didn't make it clear enough to the wider public that yes, this was indeed an entirely new console, a "next-gen" console, not just a slightly upgraded Wii. This has been estimated to be one of the reasons for the relative commercial failure of the Wii U. People were simply confused, as they thought that it was just some kind of Wii with an extra touch-based controller, or something. Many casual non-tech-savvy Wii owners didn't see the incentive of buying (what they perceived as) just another version of the same console.

Nintendo, perhaps having learned their lesson, did significantly better with the marketing of their next console, the Nintendo Switch. Massive advertisement campaigns quite cleverly made it clear that this is, indeed, an entirely new console, the next "big one" from Nintendo. The real replacement for the old Wii.

Their marketing was so successful, in fact, that as far as I understand, the Switch broke the record for the fastest-selling console in history in terms of its first week/month. If I remember correctly, the one-million-units-sold landmark was reached in just a few days, faster than any other console in history, including the PS4.

But regardless of this incredibly successful marketing campaign, it appears that Nintendo might be falling into their old habits.

The Switch was originally intended to be a merging of Nintendo's two major console lines, ie. the desktop consoles and the handheld consoles. The Switch was supposed to be, and is, a hybrid of the two, and can work as both, and thus ought to work as the next-gen replacement for both.

What that should mean, in turn, is that Nintendo's, and all third-party developers', focus ought to be concentrated on the Switch, with the Wii/Wii U and the 3DS being slowly phased out as the obsolete "last-gen" console pair. The Switch is now the next-gen console from Nintendo, for which all new games, in increasing numbers, will be made, handheld or otherwise. This is what many early Switch buyers were (and are) expecting.

But now Nintendo seems to be giving mixed signals about this, after all.

Some months ago there was a rumor that some Nintendo executive may have given hints that this might not, after all, be the end of the handheld 3DS line, and that there might be a "next-gen" version eventually, in parallel with the Switch. Of course this was just a rumor, and I don't know how reliable it was, nor have I heard of it since. Only time will tell.

Anyway, rumors aside, Nintendo just recently released a new version of the 3DS: The New Nintendo 2DS XL (that's a mouthful). This is essentially a New 3DS XL (which is a version of the New 3DS with larger screens, which in itself is an upgraded version of the 3DS) with a slimmer design and without the stereoscopic 3D effect. (The major advantage is, quite obviously, a somewhat cheaper price compared to the New 3DS XL.)

In other words, a bit over three months after they released the Switch, they have now released another version of the 3DS. This seems to signal that Nintendo still intends to support the system for at least a few years to come, rather than having it end its natural lifespan as people move to the Switch.

Many critics, and Switch owners, are worried that this means that Nintendo is not, after all, dedicating all of their time, resources and effort to Switch development, but that these will still be shared with the 3DS line. It might also signal to third-party developers to do the same.

Couple this with criticism from many major game development companies that the Switch is not a very good platform to develop big modern triple-A titles for (because it's much less powerful than anticipated), and the worries only grow stronger. Millions of people bought the Switch because they anticipated it being the next big thing. Will it, however, turn into just another Wii U in terms of its library of games and overall support?

Nintendo is giving very mixed signals here. I don't think they should, at this point.

Saturday, July 8, 2017

In defense of the "waterfall model" of software development

Software development processes are higher-level ideas and principles on how to develop a piece of software (or any system based primarily on computer software) for a required task. For very small projects it may be enough to just have a need and start coding a solution for it. However, for even slightly larger projects this becomes infeasible very quickly, especially when many people are involved in the project. (When more than one person is involved, it immediately introduces coordination problems: making sure every participant knows what to do and when, and so on.)

The so-called "waterfall model" is one of the oldest such development models ever devised, going as far back as the 1950's. While there are many versions of this model, differing in details and number of steps, the distinguishing characteristic of the model is that it consists, essentially, of big sequential stages, which are usually followed in strict order (ie. the next stage isn't started until the previous one has been finished.)

A typical simplistic example of such stages could be "requirements" (figuring out and documenting what exactly the software needs to do), "design" (planning out how the software should be implemented), "implementation" (actually writing the software), "testing", and "maintenance". Part of testing is, of course, fixing all the bugs that are found (so it partially loops back to the "implementation" step, sometimes even to the "requirements" step, if it turns out that some of the original requirements are impossible, contradictory, or infeasible.)

For decades the waterfall model has been generally considered antiquated, simplistic, rigid, and above all, inefficient. Countless "better" models have been devised and proposed over the decades, most of which promise more efficient development with higher-quality outcomes (ie. fewer bugs, faster development, and so on.) If you ask any expert software engineer, they will invariably dismiss it as completely obsolete.

As a long-time professional small-business software developer, however, I would argue that the bad reputation of the waterfall model may be undeserved, and that it could, in fact, be the best model for many projects, especially smaller ones (somewhere in the ballpark of 20-200 thousand lines of code, a few months of development time.)

The absolutely great thing about the waterfall model is that the requirements are figured out and documented in full, or almost full, before writing the software even starts. While perhaps not written in stone and never changed again during development, this document should at least lock down the vast majority of the features, preferably to the tiniest detail.

The great thing, as a programmer, about having such a complete and detailed requirements document is that once you start implementing the program, you have a very clear, complete and unambiguous picture of what needs to be done, and you can design the program from the get-go to accommodate all those requirements and features. You can plan, design and implement your module and class hierarchy to fluently and cleanly support all the required features right from the start. When done well, this leads to nice, clean and logical class hierarchies and class interfaces, and to a more robust and understandable program design overall.

Also very importantly, having an almost-complete requirements document, and thus knowing exactly what needs to be done, means that the implementation of the program will be relatively straightforward and fast. Usually the actual implementation does not take all that much time when everything that has to be done is clear from the get-go.

If such an almost-complete requirements stage and document is skipped, however, the process easily devolves into, essentially, the software development equivalent of "cowboy programming". It will also almost inevitably lead to actual "spaghetti code" in the program implementation.

In other words, if the project starts with just a vague idea of what should be done, and the concepts for the project evolve during its implementation, with new ideas and features being conceived as the project progresses, this leads almost inevitably to absolutely horrendous code, no matter how well you try to design it from the start.

What's worse, the implementation will take a very long time. Existing code will constantly need additions, changes and refactoring. Existing code will often need to be redesigned to accommodate new requirements (which were impossible to predict at the beginning).

This can turn nightmarish really quickly. Sometimes even a simple-sounding new feature, which might sound like it could be implemented in minutes, might take several hours to implement, just because the existing code was not prepared to support that feature.

This kind of software development is far from fun. In fact, it can be absolutely horrendous. And horrendously inefficient. New requirements and new ideas keep pouring in on a semi-regular basis. Some of them take minutes to implement, others can take several hours. Many of these ideas are just tests to see if they will work, and may be dropped later, after it's decided that the idea didn't work well after all. Essentially, the software implementation is used as a testbed for new ideas, to see if they will work; and if they don't, they are just discarded. This ends up wasting countless hours of development time.

And of course, as a result of all this, the program becomes an absolute mess. No matter how much you try to keep it clean and high-quality, there's no avoiding it, as hundreds and hundreds of new and changed features are patched into it, sometimes haphazardly out of necessity.

When one is involved in such a project, one really starts to yearn for a waterfall model requirements documentation, which would make implementing the program so much easier and faster.

Personally, I would trade these new poorly designed, poorly enacted "modern" software development models for a good old waterfall model any day, if it meant that I would have a clear and complete picture of what needs to be done right from the get-go, with little to no changes made during or after development. It would make the work so much easier and faster, and the end result would be of much higher quality. The whole project would also probably take a lot less time.

Tuesday, July 4, 2017

The most over-hyped movie in history

Public and/or marketing hype for a work of art is definitely a lot more common with video games, but movies also get their share from time to time, especially if it's a new movie for a popular franchise (and especially if it really is new, as in, the first one made for the franchise in a very long time).

What is the movie that was the most hyped in the entirety of movie history? There are, of course, many candidates, but I would propose that Star Wars Episode I: The Phantom Menace wins that category.

The original Star Wars trilogy is, for one reason or another, one of the most influential sets of movies in recent popular culture. Very few other movies or franchises parallel its success and pervasiveness. In the 80's, and largely in the 90's, Star Wars was everywhere, and everybody knew what it was. It was almost impossible not to. And the number of fans was staggering.

However, the third movie in the trilogy, Return of the Jedi, was released in 1983. Since then a myriad of spinoff movies, TV series, comic books and so on were made, but nothing that continued the actual movie canon.

When it was announced that a new movie in the main canon franchise would be released in 1999, after a hiatus of a whopping 16 years, fans went absolutely crazy.

Perhaps no sign of this is clearer than the fan reaction to the trailer of the upcoming movie. The trailer was, it must be said, a work of art in its own right. It's pretty awesome all on its own, but it was especially so in 1999, from the perspective of the starving fans.

The trailer was, in fact, so popular that, and I kid you not, many people were buying movie tickets to other movies just to see the Phantom Menace trailer at the beginning, and then leaving after the trailer was over. (1999 was still by far pre-YouTube time, and even a time when the majority of people didn't even have an internet connection at all, much less one that allowed downloading huge video files, so the vast majority of people had no way to see the trailer anywhere else than at a movie theater.)

Fans camped outside of some movie theaters literally for weeks prior to the premiere of the movie, and these camping tent lines were astonishingly long. (While doing this was not unprecedented, I wouldn't be surprised if this was the largest such event in movie history, in terms of the number of people in these lines and how long they were there.) The moment the theater doors opened on the day tickets went on sale was a spectacle in itself, and got news coverage (which in itself is quite rare).

Of course the movie itself turned out to be... somewhat mediocre in the end, and the reception to be lukewarm at best. As one critic put it years later, "it looks like Star Wars, it sounds like Star Wars... but it doesn't feel like Star Wars."

The reception was, of course, a bit more positive among the die-hard fans themselves, at least at first. Similar queuing lines happened at the premieres of the two other movies in the new trilogy, but they weren't even nearly as massive. (They still were quite massive, especially at the premiere of the second movie, but not as much.) I think that there was a kind of mentality where the die-hard fans were hoping that the two subsequent movies would be much better, and were at some level in denial about how mediocre the first movie was (perhaps because they didn't want to admit even to themselves how overly hyped they got for a movie that was somewhat of a disappointment in the end.)

Years later, I don't think many fans consider the trilogy in general, and the first movie in particular, to be all that good. Quite a disappointment in the end.

But the pre-release hype surrounding the movie was, in my view, unprecedented, and so far unparalleled.

Why is HDMI 1.4 so common in 4k displays?

4k displays (ie. 3840x2160 resolution) are all the rage nowadays. More and more display manufacturers are making their own 4k products.

There is one thing that I have noticed about many of them, however: Many, even most, of these displays are using HDMI 1.4, rather than HDMI 2.0. Which makes little sense.

The major difference between the two versions is bandwidth. HDMI 1.4 does not have enough bandwidth to display 4k video at 60 Hz (in uncompressed RGB format). It only has enough bandwidth to do so at 30 Hz. HDMI 2.0, on the other hand, has the required bandwidth for 4k@60Hz.
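The bandwidth arithmetic is easy to check. Here is a rough sketch that ignores blanking intervals and audio, and uses the commonly cited effective data rates of about 8.16 Gbit/s for HDMI 1.4 and about 14.4 Gbit/s for HDMI 2.0:

```python
# Payload needed for uncompressed 24-bit RGB video, ignoring blanking
# intervals (which only make the 60 Hz case worse for HDMI 1.4).
def required_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

for fps in (30, 60):
    print(f"4k @ {fps} Hz needs ~{required_gbps(3840, 2160, fps):.1f} Gbit/s")

# 4k @ 30 Hz needs ~6.0 Gbit/s  -> fits within HDMI 1.4's ~8.16 Gbit/s
# 4k @ 60 Hz needs ~11.9 Gbit/s -> exceeds HDMI 1.4, fits within HDMI 2.0's ~14.4 Gbit/s
```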

It's never a question of the display itself being incapable of displaying 4k content at 60 Hz, as invariably they support this through their DisplayPort connection. It's only the HDMI connection that limits the refresh rate to 30 Hz.

Some 4k displays do support HDMI 2.0, but for some reason they seem to be a minority at this moment.

This is problematic for several reasons. Firstly, it forces PC users to use DisplayPort rather than HDMI. Ok, perhaps not such a big deal.

But secondly, and more importantly, both the PS4 Pro and the Xbox One X have only an HDMI 2.0 output port. They do not have DisplayPort support. This means that you can't use one of these HDMI 1.4 displays with them if you want 60 Hz in RGB mode. (At least the PS4 Pro supports 4k@60Hz over HDMI 1.4, but only in YUV420 mode, which subsamples the color information, making colors less vibrant and prone to artifacts.)

I can't really understand why monitor manufacturers are doing this. Sure, they probably have to pay more in order to use HDMI 2.0 (AFAIK it's not free to use), but I doubt it's that much more.

Moreover, many manufacturers are outright hiding which version of HDMI their monitor is using. Many of them only list "HDMI" as supported input, without specifying the version number, anywhere. If you want to be sure and find out, your only recourse is to try to find some third-party review that mentions this.

Although, at this point, it's probably safe to assume that if the monitor manufacturer is not telling which version of HDMI they are using, it's probably 1.4.

Saturday, July 1, 2017

Gender discrimination in Australian Public Service hiring?

The Australian Public Service is a branch of the Australian government that provides services to almost every part of Australian life. 

In 2016, women comprised 59.0% of the APS as a whole, but accounted for only 42.9% of its Senior Executive Service officers. Is this clearly a case of gender bias (deliberate or unconscious) in hiring?

A governmental study sought to find out, by testing with applications and CVs that had no identification of the gender or any other characteristic of the applicant.

The results were surprising. There was indeed bias when applicants were identifiable. But in the other direction. As in, women were more likely to be shortlisted (ie. accepted for the next step in the hiring process) than men. Not by a lot, but measurably so (2.9% more likely, according to the study). Moreover, and perhaps more surprisingly, male reviewers were more likely to shortlist female applicants than female reviewers were.

Of course this meant that when the reviewers did not know the gender or any other personal characteristics of the applicants, ie. when this information was omitted from the CVs and the reviewers thus could not show any favoritism or bias, women actually became less likely to be shortlisted and more likely to be rejected.

This, of course, means that the feminist theory that women's minority status in managerial positions is caused by misogynist bias is, at least in this case, completely false. On the contrary, there is already bias in favor of women, rather than against them. Yet they still form a minority in the top positions.

I find the conclusion of the study interesting:
"Overall, the results indicate the need for caution when moving towards ’blind’ recruitment processes in the Australian Public Service, as de-identification may frustrate efforts aimed at promoting diversity"
I don't think there could be a more direct way of saying that "we need to deliberately favor women in hiring over men, if we want to promote diversity".

Sunday, June 25, 2017

An updated perspective on VR (after getting a PSVR)

If you have been reading this blog you might know how much I have ranted about my disappointment with VR, and why I fear it might turn out to be a complete commercial flop and a niche technology (with the same ultimate fate as the Kinect and the PS Vita). That's not to say I didn't want a VR headset. Of course I have always wanted one. I have wanted one for about 4 years now, and that desire has never diminished. The major reason why I haven't purchased one has been the exorbitant price (and the abysmal library of triple-A games.)

Recently I finally got myself a PSVR, because there was a decent deal in an online store here. I commented about it in my previous post.

I have now played several games with it, including Farpoint, Robinson: The Journey, Until Dawn: Rush of Blood, and Tumble VR (besides, of course, a dozen or so free demos and apps.)

In my past posts I have made several claims about VR, without actually having experienced it myself. Now I have. So, has this confirmed these claims? Was I mistaken? Was it as bad as I predicted?

"Room-scale VR"


The HTC Vive, especially, has always been marketed pretty much solely for "room-scale VR". In other words, you play games standing up. I always predicted that this is a completely untenable form of gameplay. Nobody plays games standing up; not for long periods of time. 15 minutes maybe, but not much longer.

It is perfectly possible to play games standing up with the PSVR, and it works ok with many games (and a few even expect it, such as Job Simulator). The range within which you can walk around is not as large as with the HTC Vive, and turning your back to the camera will cause tracking problems for the controller(s) (because the system needs to see the controller(s) for proper tracking, else it's only a best-guess based on its accelerometers). But with those caveats there's nothing stopping you from playing standing up, and many games are completely playable that way. You can look completely around you, and even move a bit (within the area visible to the camera, which at a normal distance means something like 1-2 meters of movement range).

When I started playing the game Farpoint (which uses the Aim Controller), I decided to try it standing up, for fuller immersion. It works pretty well like that.

But it turns out I was completely right: After something like a half hour of gameplay my feet and my lower back were hurting. I couldn't play for much longer.

I'm in relatively good physical shape, mind you. For instance, every time I go to work, I jog up the stairs to our office on the 6th floor without much trouble. I still couldn't play the game for more than about 30 minutes standing up. It was a sit-down game for me from that point forward (and it's completely playable like that).

Room-scale VR? Thanks, but no thanks. It just doesn't work. It might make for fancy tech demos to awe people for 10 minutes, but that's it. It's not a feasible way to actually play games in the long run, especially not for the average gamer (who, let's face it, is not exactly an athlete).

Move/Aim controllers


Some/most PSVR games can be played with the standard DualShock 4 controller (which supports tracking thanks to its light bar and accelerometers). However, the tracking tends to be relatively poor, and it will very often veer off (the in-game stand-in for the controller slowly turns to one side over time) and require readjusting every couple of minutes or so. With games that require a tracked controller and support the DualShock 4, it's ok but far from perfect.

The Move and Aim controllers are better in this regard. While they, too, can suffer from similar tracking issues, it's not even nearly as bad. In my experience you can often play for 30 minutes or longer without the tracking veering off too much. (It also seems to be somewhat self-adjusting, meaning that just rotating the controller around a bit will automatically readjust the tracking and orient it properly in-game. However, sometimes it starts veering off in position rather than orientation, especially drifting closer to the camera than it should.)

I predicted, however, that playing with them (as well as the equivalents in the HTC Vive and the Oculus Rift) would be tiresome for your arms. As pictured on the right, how long do you think you would be able to hold those controllers, with your arms extended like that?

In this case I was kind of half-right. When playing Farpoint with the Aim Controller, I indeed experienced my arms getting tired after a while, especially in sections of the game with long battles, requiring constantly holding the controller up, aiming at enemies, for extensive periods of time. The longer I played like that, the more tired my arms became, until it was almost impossible to play.

The "half-right" part comes from the fact that in most games you don't actually need to be holding your arms extended constantly. (Or, at least, in the three games so far that I have played that have required tracked controllers.) During the periods between having to use the controllers, you can rest your arms on your lap (well, at least if you are playing sitting down!)

While the Move and Aim controllers increase immersion, overall I still found Robinson: The Journey to be the most enjoyable playing experience because it does not use a tracked controller at all. It's played with the normal DualShock 4 controller, without tracking, which means that you just play it as you would play any other game. Ultimately I found this the most comfortable and likeable way to play a VR game. Tracked controllers may increase immersion, but in the long run they feel gimmicky and needlessly strenuous on your arms. (But I will fully grant that a tracked controller is better for aiming and shooting in VR.)

Nausea


Nausea, or motion sickness, has always been cited as an issue with VR, and as something games need to deal with. It has always been stated (and I fully believed it) that if the in-game camera movements do not match your physical head movement, it will very quickly cause strong nausea. Even if what's simulated is eg. you sitting in a car, it would cause nausea because you can't feel the car turning, accelerating and decelerating as normal. This has been one of the major arguments against sit-down VR and for "room-scale" VR, and it has been one of the major arguments against the idea of traditional FPS games supporting VR.

I expected that nausea would be a problem for me, but that I would get used to it. Short sessions, taking a pause when feeling nausea, getting myself accustomed to it... no matter how long it would take, I would eventually get used to it. Other people report getting used to it, so I was sure I would too.

Three of the games I have played so far have great theoretical potential to cause nausea (in the order in which I played them):

Until Dawn: Rush of Blood is essentially a rollercoaster ride, and at points it really seems to want to cause the player vertigo by having huge downslopes that are ridden at great speeds. It almost feels like the game creators wanted the player to throw up.

Farpoint resembles somewhat a traditional first-person shooter, in that you move around with the thumbstick (in the Aim Controller). The movement is not exceptionally fast, but it's normal forward/backward/strafe movement with the thumbstick, as in any FPS game. (The other thumbstick does not rotate the camera. You just look around with your head, and aim and shoot with the Aim Controller. The game has been designed so that you don't need to look nor shoot behind you.) Since in-game movement does not match your physical movement, this has great potential to cause nausea.

Robinson: The Journey resembles a traditional first-person shooter even more, in that it's played with the DualShock 4 controller, and not only do you move as normal with the left thumbstick, you also rotate the view with the right thumbstick. The rotation has been limited to horizontal rotation only (after all, you can just look up and down with your head), and the rotation happens in discrete steps rather than being smooth (although the camera rotates very quickly between these steps, rather than just jumping outright.) Certainly these limitations exist to diminish nausea, but it still has great potential to cause it. And if you search the internet, you will find lots of people reporting feeling nauseous.

So, how much nausea did these games cause me? How horrible was it?

Nothing. No nausea. None at all. Even those vertigo-inducing rollercoaster rides had absolutely no effect. The first couple of times I played Farpoint there might have been a bit of discomfort, but after that there was none, no matter how long my sessions were. With Robinson I never felt any sort of nausea at any point, even though my longest contiguous playing session was several hours long (I think it was something like 3 or 4 hours of gameplay without pauses.)

I was actually very positively surprised how little nausea I have experienced with the PSVR, even in games where theoretically I should have.

I suppose that this makes me extra right: Not only was I right in that I would get used to the nausea... but in fact it turns out that I don't have the problem at all.

That's not to say that if I got to play a traditional FPS game (like Portal 2) with no nausea-reducing limitations at all, that I wouldn't get nauseous. I might well do. But this experience has convinced me that I would most probably get used to it quite quickly, assuming I would have any nausea problem in the first place.

Resolution


This is something that I actually did not predict. Low resolution has always been cited as one of the problems with current VR, but I didn't expect it to be this bad. I was expecting the bigger problem to be the so-called "screendoor effect" in the HTC Vive and the Oculus Rift, which is caused by the visible gaps between pixels, and which is almost nonexistent in the PSVR (because, as far as I know, the PSVR uses a different type of display panel than those other two, with almost no screendoor effect at all.)

After all, the PSVR has a full-HD 1920x1080 pixel display. How bad can that be? Sure, you probably can see the individual pixels, but surely it can't be that bad?
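A rough back-of-the-envelope calculation hints at why it can be that bad. (The ~100-degree field of view and the 960-pixels-per-eye split below are my own ballpark assumptions, not official specs; the point is only the order of magnitude.)

```python
import math

def pixels_per_degree(h_pixels, h_fov_deg):
    """Rough angular pixel density across the horizontal field of view."""
    return h_pixels / h_fov_deg

# Assumption: the PSVR's 1920 horizontal pixels are split between the two
# eyes, so each eye gets roughly 960 pixels spread over ~100 degrees.
psvr_ppd = pixels_per_degree(960, 100)

# For comparison: a 24" 1080p monitor (about 53 cm wide) viewed from 60 cm
# covers roughly 48 degrees of your field of view.
monitor_fov = 2 * math.degrees(math.atan((53.1 / 2) / 60))
monitor_ppd = pixels_per_degree(1920, monitor_fov)

print(f"PSVR:    ~{psvr_ppd:.0f} pixels per degree")    # ~10
print(f"Monitor: ~{monitor_ppd:.0f} pixels per degree") # ~40
```

In other words, each degree of your view gets only a fraction of the pixels it would get on a desktop monitor, which is why the image looks so coarse.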

I was quite negatively surprised by how bad it really looks. It quite literally looks like you are looking at an old 640x480 pixel CRT monitor. It's that bad. Especially if there is no antialiasing, or very little antialiasing, images are really and very obviously pixelated, to the point of being outright bothersome. This image (from a previous blog post) demonstrates how it looks on the monitor, and approximately how it looks with the PSVR:


Even when the image has extremely high antialiasing, it still looks like you're looking at an old CRT TV. The pixelation is much less obvious this way, but it still doesn't look sharp like on a normal display.

That being said, heavy antialiasing does remove the bothersome nature of the pixelation. Robinson: The Journey succeeds in this quite marvelously. It does look a bit like watching an old CRT TV, but on the other hand the pixelation is almost unnoticeable, so it looks quite good.

Most PSVR games so far, however, do not use antialiasing that strong, which often makes them look quite bad.


Gameplay limitations


One of my biggest issues with VR has been how much the format limits gameplay and game mechanics (or, to be more precise, how much game developers limit themselves when creating VR games). Will VR be eternally relegated to vehicle simulators and Myst-like games (and other such games where you essentially stand or sit still, and just "teleport" around)?

In some aspects I was right, in others wrong. In the previous section I already described some aspects of this.

I was actually positively surprised that two of the games I have played so far had free movement using a thumbstick, rather than using some idiotic teleporting mechanic. I'm also surprised how little nausea or discomfort this caused (that is: None at all.)

Those games prove beyond any shadow of a doubt that you don't need a stupid teleporting mechanic. I'm actually a bit surprised that the game developers dared to defy the (quite false) conventional wisdom and went for full-on regular old thumbstick movement. It also proves wrong all the masses of VR fanboys who kept insisting that the teleporting mechanic is necessary, and that if the game moved in the traditional way it would cause projectile vomiting in seconds. I very strongly suspected them to be wrong, and I was proven completely right.

That being said, both games still have their limitations, both because of VR and to reduce potential nausea.

Farpoint has been designed to be played as a sit-down experience, which is great (because, as said, playing standing up for long periods of time is just not feasible.) However, severe compromises have been made in the game because of that. All levels have been designed so that you can advance through them within a 180-degree arc. You never need to move back. Enemies never attack you from behind (which would feel quite unfair if you are sitting down). Even if an enemy is behind you, it will move in front of you (which sometimes looks a bit ridiculous, because the enemy will just run right past you, into your line of fire, rather than shooting you in the back, which would make more sense.)

Robinson: The Journey has also been designed to be playable while sitting down, but it gets rid of that limitation by allowing the view to be rotated horizontally, as in any regular FPS game. As mentioned, however, the rotation is not smooth, but done in discrete steps. While the jumps between steps are not immediate, and instead the camera rotates very rapidly between them, it's still there. (Of course, given that you can look around freely with your head, this is not really a limitation in terms of where you can look.) This discrete camera rotation has certainly been added to reduce nausea. In my case, at least, it works like a charm for that purpose.

Neither game (or any of the other games I have played) has jumping or crouching as a game mechanic. (Of course if you are standing, you could jump and crouch if you really wanted, but there's little to gain from it. It also would probably be inadvisable, especially jumping.)

The latter game has climbing as a game mechanic, though, and it works surprisingly well. Kudos. (It can be slow as molasses, though. Of course it's more of a puzzle game than a hectic Doom-style first-person shooter, so it doesn't really matter.)

The one thing I'm most glad of is that the game developers discarded the stupid idea of teleporting. It makes the games infinitely more enjoyable and playable.

Saturday, June 24, 2017

"People of color" is a racist term

It has always been a joke that the politically correct term for black people is ever-changing. The old term becomes "racist" (somehow), and the new one is now the "politically correct" one.

Well, it appears that the current "politically correct" term for non-white people is "people of color".

For example, Anita Sarkeesian, in one of her most recent videos, calls all non-white people "people of color". That's including Chinese and Japanese. (She specifically shows examples of Chinese and Japanese, and refers to them as "people of color".)

Well, I have a question for Anita: What, exactly, is the "color" of Chinese and Japanese people?

Uh-oh... We have a problem here, don't we?

Thursday, June 22, 2017

Aspect ratio sensitivity

For some reason I cannot really even begin to comprehend, it appears that most people are completely unable to see if the aspect ratio of a live video (such as a TV show or movie) is incorrect. And I mean even if it's way, way, way incorrect. Like when a 4:3 video is stretched horizontally to fill a 16:9 screen (which means that the video is compressed vertically to 75% of what it ought to be). I don't know if they indeed have a brain malfunction that makes them completely incapable of seeing the problem, or if they just refuse to admit they see it due to some strange psychological phenomenon. In either case they will claim to their graves that the glaringly obvious stretching of the image doesn't bother them at all (again, for a reason that I cannot even begin to comprehend).

I myself am what could perhaps be called hypersensitive to wrong aspect ratios. If some video footage has even a slightly wrong aspect ratio, I very quickly notice it, and it bothers me.

For example I was recently watching a video and I very quickly noticed that it seemed to have a slightly wrong aspect ratio. I kept watching, and I became more and more convinced of that fact.

It turns out that the video was supposed to have a 16:9 aspect ratio, and while it had 1280 pixels horizontally, rather than being 720 pixels vertically, for some reason it had been squeezed to 668 pixels. That's less than an 8% difference. I still noticed it almost immediately. When I scaled the video vertically to 720 pixels, it immediately started looking correct.
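For the record, here is a quick sketch redoing the numbers from the two cases mentioned above:

```python
# Case 1: a 4:3 video stretched horizontally to fill a 16:9 screen.
# Relative to its width, the image ends up at 3/4 of its proper height.
print(f"Vertical squash: {(4/3) / (16/9):.0%}")           # 75%

# Case 2: the 1280-pixel-wide video described above, squeezed to 668
# pixels tall instead of the 720 that 16:9 would require.
width, actual_h, correct_h = 1280, 668, 720
print(f"Error: {1 - actual_h / correct_h:.1%}")           # ~7.2%
print(f"Displayed aspect ratio: {width / actual_h:.2f}")  # ~1.92
print(f"Intended aspect ratio:  {width / correct_h:.2f}") # ~1.78 (16:9)
```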

I have yet to meet another person with this kind of "hypersensitivity". Or at least one that admits to seeing it. Much less one who is bothered by it. I don't really get it.

Wednesday, June 21, 2017

Canada's tough stance against wife-beaters

Recently a man in Canada beat his wife for half an hour with a hockey stick. He pulled her hair, hit her in the face, and threatened to kill her. She had to be hospitalized.

The man pleaded guilty in court. Can you guess how long his jail sentence was for this brutal crime?

Perhaps eight years? That could sound about right. But no.

Perhaps eight months? Sounds perhaps a bit on the low side, but might still be barely acceptable. And no.

Perhaps eight weeks? Would sound way too short for such a heinous crime.

No. Eight days.

That's right. Eight days of jail.

That's quite a strange amount of leniency in a country whose government prides itself on being feminist, and whose prime minister not only proudly declares this, but just can't shut up about feminism.

But perhaps the man's name reveals the reason for the incredibly lenient punishment: Mohamad Rafia.

Way to go, "feminist" Canada. You are the true defender of women against spousal abuse.

Sunday, June 18, 2017

Prejudice against 3DS grip attachment

I have owned a (New) 3DS for a couple of years now, and owned a regular DS for a couple of years before that. I have always found them really unergonomic and uncomfortable to use. The positioning of the left stick and the d-pad is not very ergonomic, and requires some uncomfortable thumb contortion. (Perhaps the worst-case scenario is having to use the left shoulder button and the d-pad at the same time. It might not sound like that big of a deal, but it really is quite uncomfortable.)

Some months ago I realized that there exist grip attachments for the 3DS, and they are actually really cheap. I purchased one that looks pretty much like this:


The grip attachment makes it a hundred times more comfortable to use, and I'm not exaggerating one bit. It might not make it absolutely perfect, like a modern controller would be, but it's still a hundred times better. I wouldn't go back in a million years.

So far three acquaintances who own a 3DS have seen me playing with it. What kind of comments did they make (completely unprompted)? Did they perhaps consider it cool? Maybe they wanted to try it to see how comfortable it is? Or perhaps they mocked it lightly for looking so funny, but were otherwise cool with it, understanding how much more comfortable it makes using the device?

No. All three of them immediately assured me how they don't need such a thing, and how useless it is.

And mind you, these were three completely separate events; it wasn't like all four of us were gathered in the same place at the same time and they all commented on it, agreeing with each other. No, each encounter was separate, and they didn't know of each others' opinions. All three independently commented that they don't need such a thing.

And no, none of the three had ever tried one. They still didn't "need" one, though, as they assured me. For some reason. Without me asking anything.

One person saying that, ok. Two people saying pretty much the same thing? A bit strange, but alright. Three people independently saying pretty much the same thing? Now that's a symptom of some kind of strange common psychological phenomenon.

I don't really understand where this strange psychological phenomenon comes from. Why did all three feel the need to assure me, without prompting, without me asking, that they don't need such a thing? And why were they all so negative about it? They didn't even joke about the thing (which I would have understood; it might look a bit funny at first). They just stated in all seriousness that they don't "need" it.

Strange.

Saturday, June 17, 2017

"Cultural appropriation" of white culture

Like everything that the modern social justice cult is promoting, "cultural appropriation" is one of those things that not only is envisioned by the social justice totalitarian regime, but is becoming more and more prevalent as legislation (due to the social justice religion being incomprehensibly virulent, and infecting government after government, and law-maker after law-maker).

"Cultural appropriation" means that white people are not allowed to do or use anything from non-white cultures. Of course such rules only apply to white people, as always, because social justice is deeply, deeply racist, and only blames white people for everything, and wants to relegate white people to subhuman second-class citizenship with less rights than everybody else.

The same rule doesn't apply between non-white people. And, most especially, it naturally doesn't apply to other people "appropriating" white culture.

When these social justice warriors are confronted with that issue, they have a stock response that you will always hear: "What white culture?" Ask the question, and that's the response you'll get 99% of the time. (The other 1% will be some kind of variant that denies white people having any sort of culture.)

They have concocted, and they seriously believe, this notion that white people have no culture and no history of their own, and that they have never invented anything. That the only thing they have done during their entire existence is "steal" from other people. They seriously consider white people, somehow, a species of its own that is incapable of doing the same things non-white people can, and is only able to steal and copy from others.

But of course white people have a long and rich culture in all aspects of life, such as art, culinary arts, architecture, clothing, music and literature.

As an example, so-called "high fantasy" is pretty much solely a product of white people. It's heavily based on old Norse culture and mythology. Most concepts used in high fantasy come from that, such as wizards, elves, goblins, dragons, orcs, and so on and so forth. (In other cultures there may be similar fictional concepts, but Norse mythology is not based on them, and formed on its own. Just because there is some superficial similarity between two things doesn't mean that one was based on the other.)

Tolkien, one of the "fathers" of modern high fantasy, based much of his fiction on Norse mythology. Even his fictional languages were largely based on European languages, especially Nordic ones.

Another big influence in modern fantasy, and many other forms of fiction, is Greco-Roman mythology. Tons of things are influenced pretty much directly by it. Whenever there are nymphs, fairies, winged horses, bull-headed humans, multi-headed hydras and so on and so forth, they come directly from Greco-Roman mythology.

In fact, much of modern European-American civilization, how society works, and how governments are run, is rooted in ancient Roman society and government. Much of modern art and architecture started in ancient Rome, and it remains a great influence to this day.

Almost all of the historical art existing today in the west, such as paintings and sculptures, was made in Europe, using techniques developed in Europe. Classical music, and music theory, are deeply rooted in Europe.

That white culture.

But, of course, social justice warriors will deny those things being "white culture", because they cannot accept white people having anything of their own. If everything else fails, they will find even the tiniest detail in those things that was influenced by some other culture, no matter how minor and inconsequential, and declare the entire thing part of that other culture and "stolen" from it. And if even that fails, they will find a similarity to something in another culture, and declare that because there is similarity, that's not "white culture". (And of course it never works in the other direction, because reasons.)

Friday, June 16, 2017

A very strange form of racism in Japan

I once saw a YouTube video (which I can't find anymore) of the rather strange experience of a guy teaching English in a school (I think it was a high school) in Japan.

The guy was born in the United States, and one of his parents was American and the other Japanese. Naturally he was always interested in visiting and perhaps working in Japan. Since he was a university graduate in the subject, and had all the necessary qualifications, he decided to seek a job there as an English teacher.

In the video he goes to lengths explaining the experience, but I'm going to summarize the interesting bit here: In the school where he first got the job, the principal (a native Japanese) was strangely antagonistic against him. She would keep demanding lesson plans far beyond what was reasonable, she would assign him unreasonably many teaching hours, she would often go to his class, in the middle of the lesson, to berate him for something, in front of the students, and so on and so forth. It was quite clear that the principal was trying to make his life as miserable and hard as possible, without outright doing anything explicitly illegal. She quite clearly wanted him to leave the job.

And it worked, too (although according to him, probably for the better). He sought a job at another school, and finally got it.

But why was the principal at the first school so antagonistic?

It turns out to be a rather strange form of racism in Japan. Not in all places, but in many. The reason was that he looked too Asian, too Japanese. (While half-Japanese, half-American, he certainly looks more Japanese than American.)

You see, in many places in Japan there's this quite strange prevalent prejudice that people want English teachers to look foreign, rather than Asian. They don't trust a non-foreign-looking person to be a good English teacher. If the teacher looks too Japanese, they don't trust him to be competent at the job. The prejudice is so strong that it doesn't even matter if the teacher has lived his entire life in the United States and has plenty of academic qualifications. It was not only the principal of the school who was prejudiced like this, but also many of the students' parents.

Japan has quite an obsession with the English language, for certain, but this is one of the strangest forms of it that I have ever heard.

Thursday, June 15, 2017

PSVR + 4k display problem explained

The PSVR has a (rather incomprehensible) technical drawback when it comes to the PS4 Pro and a 4k display.

The PS4 Pro has an HDMI 2.0 output connector, which allows it to use an HDMI 2.0 capable 4k display at that resolution, at 60 Hz, in RGB mode (which means essentially a completely lossless picture). Also, if the 4k display supports HDR, using it likewise requires HDMI 2.0.

The PSVR, however, degrades this scenario to HDMI 1.4, because the processing unit box that sits between the console and the display has only HDMI 1.4 connectors. This means that if the PSVR processing unit is in use (which is necessary for PSVR to work), then the 4k display cannot be used in RGB mode, nor using HDR. The display will still be used at 60 Hz, but in YUV420 mode (if the display supports it), which has degraded color quality (essentially it's a lossy image, where there is less color information, causing less vibrant colors and some color artifacts).
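
As a rough back-of-the-envelope check (a sketch using the commonly cited nominal data rates, and ignoring blanking intervals and other protocol overhead), here is why 4k at 60 Hz in RGB mode doesn't fit through HDMI 1.4, while YUV420 does:

    # Back-of-the-envelope HDMI bandwidth check. Nominal figures only;
    # blanking intervals and protocol overhead are ignored, so real
    # requirements are somewhat higher than these "active pixel" numbers.

    def active_pixel_rate_gbps(width, height, fps, bits_per_pixel):
        """Raw bits per second needed for the active pixels alone."""
        return width * height * fps * bits_per_pixel / 1e9

    # 8 bits per channel:
    #   RGB (or YUV444): 24 bits per pixel
    #   YUV420:          12 bits per pixel on average (chroma at quarter resolution)
    rgb_4k60    = active_pixel_rate_gbps(3840, 2160, 60, 24)   # ~11.9 Gbps
    yuv420_4k60 = active_pixel_rate_gbps(3840, 2160, 60, 12)   # ~6.0 Gbps

    # Commonly cited effective video data rates (after 8b/10b encoding):
    HDMI_1_4_GBPS = 8.16   # out of 10.2 Gbps raw
    HDMI_2_0_GBPS = 14.4   # out of 18 Gbps raw

    print(f"4k60 RGB    needs ~{rgb_4k60:.1f} Gbps, fits HDMI 1.4: {rgb_4k60 < HDMI_1_4_GBPS}")
    print(f"4k60 YUV420 needs ~{yuv420_4k60:.1f} Gbps, fits HDMI 1.4: {yuv420_4k60 < HDMI_1_4_GBPS}")
    print(f"4k60 RGB    fits HDMI 2.0: {rgb_4k60 < HDMI_2_0_GBPS}")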

The following diagram illustrates the exact situation (click the image for a larger version):


The HDMI cable connections marked with blue lines indicate the setup when using the PSVR. The connection marked with the green line denotes a setup where the PS4 Pro is connected directly to the display.

These two setups are mutually exclusive, because the PS4 Pro has only one HDMI output. Both the blue and green connections cannot be made at the same time. This is even more of a problem if the display itself has only one HDMI input connector (which is quite common, even on 4k displays).

This situation is very hard to fix even with HDMI switcher boxes (especially if the display has only one HDMI connector).

For starters, most HDMI switchers are designed to have two or more input HDMI connections, and one output connection (because their idea is that you can choose to display one of multiple picture sources). This might theoretically help with half of the problem in the diagram above, on the display end (where the input signal may come from either the console or the processing unit). However, it still doesn't help with the fact that the console needs to output to either the display directly, or to the processing unit.

This would require a quite complex HDMI switcher box, which is able to take the input from the PS4 and redirect it either to the display directly, or to the processing unit, and in the latter case it also needs to take the input from the processing unit and redirect it to the display. So in essence it requires redirecting input A to output A while at the same time redirecting input B to output B. And, as an alternative, it needs to be able to redirect input A to output B (and ignore input B). I'm not sure such HDMI switchers even exist.
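
Just to make that routing requirement concrete, here is a tiny, purely illustrative model of the two configurations such a hypothetical switcher would have to support (the device names are made up for the example):

    # Purely illustrative model of the routing described above.
    # Inputs:  the PS4 Pro's HDMI out ("ps4_out") and the processing
    #          unit's HDMI out ("pu_out").
    # Outputs: the processing unit's HDMI in ("pu_in") and the display.
    ROUTING_MODES = {
        # PSVR in use: console feeds the processing unit, which feeds the display.
        "psvr":   {"ps4_out": "pu_in", "pu_out": "display"},
        # PSVR bypassed: console feeds the display directly; the PU output is unused.
        "direct": {"ps4_out": "display", "pu_out": None},
    }

    def describe(mode):
        for source, destination in ROUTING_MODES[mode].items():
            print(f"{mode:>6}: {source} -> {destination or '(unconnected)'}")

    describe("psvr")
    describe("direct")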

You could try to get away with it by having two HDMI switches, one on the PS4 side, and another on the display side. However, the one on the PS4 side would still need to be of the type that takes one input and is able to redirect it to two optional outputs. As far as I know, this is a rarer form of HDMI switching.

Secondly, the switcher(s) need to support HDMI 2.0. The PS4 Pro is really picky about this (probably for technical reasons rather than deliberately). If you connect an old cheap HDMI switcher between it and the 4k display, it just won't work. The display won't receive any signal. While HDMI 2.0 capable switchers may be becoming more common, in my experience they still tend to be quite rare. Also, some research reveals that people who have tried to solve this situation with switchers have found that it indeed simply doesn't work. The PS4 Pro seems to be really picky about something being between it and the display (even if that something supports HDCP).

The only practical solution is to manually switch connections every time you want to switch between PSVR games and other games. That is, in the diagram above, switch the green connection to the blue one both on the PS4 side and the display side (if your display has only one HDMI port).

A slight alleviation can be achieved by having a display with two HDMI ports, or using two displays (in which case only the connection on the PS4 side needs to be manually changed).

Monday, June 12, 2017

Microsoft still has time to change the name of their new console

At this year's E3 conference (which happened yesterday as of this writing), Microsoft finally revealed the new upcoming version of their Xbox console line, which previously had the working title "Project Scorpio", planned for release in November 2017.

The name of the new console? Xbox One X.

Many people were quite appalled and disappointed by that name ("what was wrong with 'Xbox Scorpio'? That would have been an awesome name!"), and many have raised a concern about how much confusion it might cause, because Microsoft's previous minor upgrade to the console line was the "Xbox One S", which sounds very similar when pronounced out loud.

This may well cause confusion with eg. parents buying the console for their kids as a Christmas present, if they know very little about consoles, especially about a specific line of consoles.

To be fair, "Xbox Scorpio", or "Xbox One Scorpio" (if they want to denote in the name that it's fully compatible with the Xbox One) might be slightly confusing as well because they named their previous upgrade "Xbox One S", so someone might think that the "S" stands for "Scorpio", that it simply has been shortened.

But it really seems that, for some reason, Microsoft just loves confusing names for their consoles. The name "Xbox One" itself was already confusing enough because that's the exact name that was commonly used for the very first Xbox console (at least when spoken out loud), to differentiate it from the Xbox 360. I really have no idea why they decided to name it like that. It seems really silly.

I wonder if Microsoft will change their mind and name their upcoming console in a less confusing way. They still have time.

Sunday, June 11, 2017

My experience at applying for a job at Ubisoft

I haven't talked much about this particular incident with many people, nor ever written about it, but here it is.

Disclaimer: This is not criticism against Ubisoft as a company. It's just a recount of my experience with one particular individual manager at Ubisoft Montreal, who was tasked with interviewing me.

This happened about 12 years ago. I had a good friend who was working at the time as a lead developer at Ubisoft Montreal, in Canada, and he had suggested, and I entertained the idea, that I would apply for a job there as well. Since I live in Finland, that would have been quite a change in my life. But I finally decided that what the heck; being a game developer was the dream of my life, and here was a golden opportunity to fulfill that dream. The opportunity was made even easier by the fact of having a connection inside the company, who could speak on my behalf, and who, if I got the job, would help me settle in. What more could I hope for?

So my friend arranged for me to have a job interview via phone.

The interview was less than stellar. The person on the other end (from what I understood, some kind of project manager) was not a native English speaker, and spoke with quite a heavy foreign accent that was, for me at the time, a bit difficult to understand. I, of course, am not a native speaker either, nor did I have much experience speaking English out loud. And, it appeared, as much as I sometimes had some difficulty understanding him, he had even more difficulty understanding me. (For example, for some reason, he simply could not understand the word "graph" when I tried to explain my experience programming tools for graph manipulation. It was a bit weird. I don't think that word is so hard to pronounce or understand...)

Anyway, after something like 20 minutes of somewhat painfully trying to wade through some basic questions, the interview was finally at its end, and he quite clearly and unambiguously stated that the job application procedures would continue from here, and that the next step would be an interview in person, meaning that I would have to go to Montreal for a live interview (Ubisoft paying the expenses). He made this quite clear, and there was no possibility of misunderstanding.

It turns out that saying that was quite a dick move.

I was fully expecting for it to happen. At the very least the live interview itself, if not getting the job and moving to Canada for the foreseeable future. I was fully mentally prepared to move to the other side of the world for this.

Weeks went by, and nothing. No emails, no phone calls, nothing. Weeks turned into months, and still nothing. Yes, in retrospect I should have contacted them after a few weeks, or after a month, and asked them about the situation (and nowadays I would absolutely not hesitate to do that), but back then I was younger and less experienced in life, so I just didn't. Stupid, I know, but I was naive.

So some months went by, with absolutely no contact, and then I had a conversation online with my friend (we weren't communicating all that often), and I mentioned this situation, and he told me something along the lines of "oh, they haven't told you? They decided not to hire. I thought they would have told you." I understood from the rest of what he said that the manager had made the decision pretty much immediately based on the phone interview.

He never bothered even sending me an email to notify me.

Which is why I think it was such a dick move. From what I understood, that manager had decided immediately from how badly the phone interview had gone that he wouldn't hire me, even though he said otherwise at the end of the call, and he never informed me that there would be no live interview. Ok, I can give him the benefit of the doubt in that perhaps he wasn't thinking of it while the phone call was going on, and instead decided it afterwards; however, not even bothering to inform me was quite a dickish move. I literally lived for a couple of months fully expecting to move to Canada (or, at the very least, get a trip there for the interview.)

Yes, as said, I should have contacted them and asked, so I was a fool myself as well, but I think common courtesy would dictate that the manager inform me that he had reversed his decision, rather than leave me hanging.

Friday, June 9, 2017

The dilemma of Microsoft identifying your PC

I purchased the PC I'm currently using several years ago. Alongside it, I purchased Windows 7.

A while back Microsoft offered a free upgrade to Windows 10 to all Windows 7 (and 8) users. (They really, and somewhat obnoxiously, pushed this upgrade onto people, but whatever; that's not the subject of this post.) I took the upgrade, because why not.

The offer of the free upgrade was a limited time thing, though. I think it was a year or something like that. As far as I know, they are not offering it anymore, meaning that if you today want to upgrade from Windows 7 or 8 to 10, you'll need to purchase the latter.

This got me thinking: I don't actually have a physical copy of Windows 10. It was an upgrade over the installed Windows 7 done through the internet. I only have the physical disk for Windows 7. So what happens if for whatever reason I need to reinstall Windows from scratch? (For instance, the most likely scenario for this is if my hard drive completely breaks.) Would I need to purchase Windows 10, if I want to keep using it? Or would I be stuck with Windows 7?

However, I did some research on this subject and it appears that if you own a valid product key for Windows 7 (which comes with the physical disk), you can download and install Windows 10 from Microsoft and register it using that key. That's quite nice.

However, apparently, it goes even farther than that. You actually don't need the Windows 7 product key either. As long as you have had a legal version of Windows 10 installed on your PC, you can re-download and re-install it on that same PC, and it will detect that it had been legally installed before, and will register itself again. Therefore even if you lose your original installation disk, with the product key, you will still be able to re-install Windows onto the same PC.

I don't know exactly how it recognizes that it's the same PC, but I'm supposing it uses a CPU ID, and perhaps other uniquely identifying hardware info, that it sent to Microsoft when you registered Windows.
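
Purely as an illustration of the general idea (this is not how Microsoft's activation actually works; that isn't public), a handful of hardware identifiers can be hashed into a single stable ID along these lines:

    # Illustrative sketch only: NOT Microsoft's actual activation scheme.
    # It just shows how a few machine identifiers could be combined into
    # one stable fingerprint that survives an OS reinstall.
    import hashlib
    import platform
    import uuid

    def hardware_fingerprint() -> str:
        components = [
            platform.machine(),           # e.g. "AMD64"
            platform.processor(),         # CPU description string
            format(uuid.getnode(), "x"),  # a MAC address (may be random if none is found)
        ]
        return hashlib.sha256("|".join(components).encode("utf-8")).hexdigest()

    print(hardware_fingerprint())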

This, of course, raises contradictory feelings.

On one hand it's really nice to have this kind of safety. If for example your hard drive fries, you don't have to worry about having to purchase Windows again, if you don't have the original disk. Microsoft could have easily gone the greedy route and said "oh, your computer fried? Well, sucks being you. You'll have to pay us again for Windows." But instead, they are being really fair, and if you have purchased Windows once, you don't have to do it again (at least if you are installing it on the same PC.)

On the other hand, Microsoft has uniquely identifying information about your PC. Microsoft can, if they so choose, see pretty much everything you do with your PC and identify exactly which PC it's coming from. Any person with a sense of privacy ought to get shivers going down their spine.

One has to simply trust that a corporation like Microsoft is not going to abuse this kind of information. Given Microsoft's less-than-stellar reputation, they haven't exactly earned this kind of trust, at least not fully.

So it really is a dilemma. On one hand they are being incredibly fair by offering this kind of service where you don't need to purchase Windows again due to an accident, which is very nice. On the other hand, they have uniquely identifying information about your PC, which is a bit of a scary thought.

Wednesday, June 7, 2017

An additional reason why ad blockers are good

Some time ago I decided to disable my browser's ad blocker plugin on channelawesome.com because I wanted to support the site, even if it's just a tiny bit. The ads there are not completely in-your-face, nor do they hinder browsing and enjoying the site, so why not.

However, I later enabled the ad blocker again, because the ads had another negative side-effect.

From time to time I browse the list of new videos on the site's front page, and open those I want to watch in new tabs. I might open half a dozen tabs at once, or even more.

The problem with the ads is that they are really heavy for the browser to render. They are constantly changing, they play videos, and whatnot. With half a dozen tabs opened, all of them full of ads, the browser becomes really laggy. Just scrolling one page is really laggy. CPU usage is very high.

So I enabled the ad blocker and reloaded all the tabs, and what do you know, CPU usage went way down and there is no lag anymore. Scrolling the pages is silk-smooth again.

So yeah, another reason to disable ads, even at sites where I would like to enable them for support.

Monday, June 5, 2017

I got a PSVR. Is it any good?

If you have read this blog, you'll know how much I have ranted about my disappointment in how VR turned out, and how it has all the signs of being in danger of becoming a commercial failure of the same kind as the Kinect and the PS Vita. And of course one of my major gripes about it is the exorbitant prices of the headsets (which are more expensive than your average PC; even one that meets the minimum requirements).

That doesn't mean, and has never meant, that I wasn't eager to get my hands on one of the headsets and try it for myself. It's just that I was not really ready to pay the exorbitant price for a device that at this point is mostly just a gimmick to run a few demos and small indie games, with an abysmal library of triple-A games.

A couple of weeks ago there was a deal in an online store here, where they were selling a bundle that contained the PSVR headset, the game Farpoint, and an Aim Controller, for cheaper than the headset alone would normally cost. So I finally decided to bite the bullet and purchase it (even though it still cost more than a brand new PS4 Pro, but what the heck).

So, what's the verdict? Was my mind blown? Did I see the error of my ways, and did all my doubts just vanish, and was I "converted" to the religion of VR?

No. The experience was about what I expected. It wasn't especially surprising or mind-blowing.

I have tried the Oculus Rift Development Kit 1 in the past, and while it had a lower resolution (and thus looked pixelated as hell), I knew pretty much what to expect.

That's not to say I didn't like it. I did. The stereoscopic effect is quite good, and I was actually a bit surprised by how stable the head-tracking is, and how you are able to look in pretty much any direction (including way up and down) without tracking problems, considering that the tracking is done by a camera (although, I think, there are also accelerometers in the headset itself).

It can be quite annoying that the tracking has serious lag issues when the console is loading something (eg. starting up a game, or loading a level). I wish they had implemented a solution to that. However, within a game the tracking tends to be very stable and reliable, almost without exception.

For some reason, however, the tracking of the DualShock 4 leaves something to be desired (in games where it's being tracked and shown). Its orientation within the simulated world tends to drift as it's being used. You can reset its orientation (by keeping the options button pressed for a second or two), but it can become a bit annoying if you have to do this every couple of minutes.

My biggest disappointment, however, is the pixel resolution.

The headset has a full-HD 1920x1080 display panel, so one would think that ought to be more than enough for a very decent picture quality. Most reviews, however, mention how regardless of this there's still very visible pixelation, which can be annoying.

They were right. It's hard to imagine how bad the pixelation is until you see it for yourself (the irony of which doesn't escape me, considering what I said in the other article I linked to in my "verdict" paragraph).

The viewing angle of the headset is somewhere between 100 and 110 degrees (it's hard to find an exact figure). This would be approximately equal to watching a 22-inch 1080p monitor at a distance of about 20 cm. However, the pixels look much bigger with the headset than with a monitor viewed at that distance. I don't know what the technical reason for this is, but due to how the headset is physically implemented, it might be that the lenses magnify the pixels at the center of the view.
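
That 22-inch comparison can be checked with a bit of trigonometry. This is only a rough sanity check; it ignores lens distortion and the fact that the headset splits the panel between the two eyes, so each eye only sees about half of the horizontal pixels:

    # Rough check of the "22-inch monitor at ~20 cm" comparison.
    import math

    def monitor_hfov_deg(diagonal_inch, distance_cm, aspect=(16, 9)):
        """Horizontal field of view (in degrees) of a flat monitor seen head-on."""
        w, h = aspect
        width_cm = diagonal_inch * 2.54 * w / math.hypot(w, h)
        return 2 * math.degrees(math.atan((width_cm / 2) / distance_cm))

    hfov = monitor_hfov_deg(22, 20)   # roughly 101 degrees
    ppd = 1920 / hfov                 # roughly 19 pixels per degree across the whole panel
    print(f"A 22-inch monitor at 20 cm covers about {hfov:.0f} degrees horizontally")
    print(f"1920 pixels spread over that angle is about {ppd:.0f} pixels per degree")

That lands in the 100-110 degree ballpark, and at around 19 pixels per degree at best, which is far less angular detail than the same monitor provides at a normal viewing distance.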

In some games it's more bothersome than in others. For example the Resident Evil 7 Kitchen Demo was particularly annoying in this regard. An example screenshot of the demo:


This is a detail in that image, how it looks normally on screen, and how (approximately) it looks with the PSVR headset (click the image for a larger version):


You might think I'm exaggerating, if you have never used a PSVR, but it really looks that bad. And it looks even worse when animated because a moving image just emphasizes the pixelation.

I have never used an HTC Vive or an Oculus Rift, but I doubt they are significantly better in this regard. While the PSVR has 1080 horizontal lines, those have 1200 lines, which is only about 11% more. I doubt this improves the pixelation problem by much.

In fact, based on this experience I would guesstimate that even a 4k VR headset would have some pixelation problems. It would, of course, be significantly better in this regard, but I'm guessing that you would still be able to see the pixels quite clearly. Which seems incredible given how dense a 4k display is, but on the other hand, it just doubles the amount of pixels on each axis compared to 1080p.

The resolution is, of course, a technical limitation. Surely if consoles and PCs had a hundred times as much rendering prowess as they currently have, and if display technology was ultra-cheap and we could easily use 8k displays in VR headsets and have PCs and consoles run games easily using that resolution, that would be done. However, that's not where technology is currently, and we simply have to live with it.

This is, in fact, one additional notch to my list of disappointments about VR: Low resolution, which feels like quite a big step back in gaming technology.

Speaking of 4k, PSVR has a really annoying technical limitation, which is quite incomprehensible, given that PSVR and the PS4 Pro came out at about the same time (and thus were developed pretty much concurrently).

The PS4 Pro has an HDMI 2.0 output. This means that it's able to output to a 4k display at 60Hz in RGB mode.

However, the processing unit of PSVR (which sits between the PS4 and the display, and which the headset also is connected to) has only an HDMI 1.4 output. HDMI 1.4 cannot output to a 4k display at 60Hz in RGB mode (because it doesn't have the required bandwidth.)

This means that when you have the PSVR set up, the PS4 Pro cannot connect to your 4k display directly, using HDMI 2.0. It can only connect through the PSVR Processing Unit, meaning it can only connect to your display using HDMI 1.4, which means it cannot display in RGB mode.

It is able to use the 4k display at 60Hz, but only in YUV420 mode (which requires less bandwidth), if the display supports it. However, this is an inferior mode with a reduced color resolution (YUV420 works by converting the image to brightness and color components, and then reducing the resolution of those color components; the same technique as old analog TVs use.) The colors are not as vibrant, and there may be visual artifacts in the colors (similar to what happens in analog TVs). Also, if the display supports HDR, it can't be used in this mode.
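
As a minimal sketch of what that chroma reduction means (real video pipelines filter and interpolate more carefully than this), 4:2:0 subsampling stores only one color value per 2x2 block of pixels, while keeping the brightness plane at full resolution:

    # Minimal sketch of 4:2:0 chroma subsampling: the brightness (Y) plane
    # keeps full resolution, while each color (chroma) plane is stored at
    # half resolution both horizontally and vertically. Real encoders filter
    # before downsampling and interpolate when upsampling; this just averages
    # 2x2 blocks and repeats them back.
    import numpy as np

    def subsample_420(chroma):
        """Average each 2x2 block of a chroma plane into one stored value."""
        h, w = chroma.shape
        return chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def upsample_420(chroma_small):
        """Repeat each stored chroma value back over its 2x2 block."""
        return chroma_small.repeat(2, axis=0).repeat(2, axis=1)

    # A tiny 4x4 chroma plane: 16 values go in, only 4 are actually stored.
    cb = np.arange(16, dtype=float).reshape(4, 4)
    stored = subsample_420(cb)
    print("stored chroma values:", stored.size, "out of", cb.size)
    print(upsample_420(stored))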

This is quite a major hindrance, if you have a 4k display (as I do.) Either you will have to use the console in an inferior display mode, or you will need to physically switch cables every time you switch from a PSVR game to a regular game, and vice-versa. (No, there is no way around this. You really have to physically switch cable connections. Any sort of HDMI switchers won't work with this.)

Sunday, June 4, 2017

European culture is dying

During the history of humanity, nations, societies and cultures have risen and fallen countless times. Some prospered for an extremely long time, others were short-lived. Some were rather insignificant, others were the pinnacle of human civilization and progress. The reasons for their fall are very varied. Some were conquered by outside forces and effectively destroyed. Others destroyed themselves from the inside due to greed and hunger for power. A few even just dissolved pretty peacefully, perhaps because civilization progressed and society changed.

But however many civilizations have disappeared in this way, I'm not aware of a single one that actively sought to destroy itself, to destroy its own culture and society, and its very existence, by willingly and actively bringing other cultures and societies into itself, and actively shunning and squashing its own. I'm not aware of any such society that hated itself so much that it pretty much effectively wanted to commit societal suicide and sought for itself to disappear and to be replaced by foreign cultures from distant lands. Many had healthy patriotism, others had unhealthy nationalism, others were somewhere in between, but no society or nation I'm aware of hated itself so much as to want to disappear, as to want to be invaded by other cultures and destroyed.

Except modern-day Europe, that is.

The modern societal European zeitgeist is one of absolute self-hatred, and self-destruction. Europe hates itself, and wants to be invaded by "exotic" cultures from distant lands, and wants to be completely replaced by those other cultures, no matter how regressive they might be. Europe refuses to see anything positive about its own culture and history, and concentrates solely on the negatives, isolating and exaggerating them, and defining itself solely by them. Europe, as a collection of cultures and rich history, wants to disappear and to be replaced by other cultures, and is actively seeking to bring those cultures into Europe to perform this replacement. It's like a reverse invasion. Rather than Europe going somewhere else to invade, it's inviting outside forces into Europe to invade it, until Europe has pretty much disappeared.

This angers and saddens me. Europe has such a long and rich history and culture. Just think of the great achievements of European architecture, literature, music, culinary arts, performing arts, paintings, sculptures, monuments. Think of the technological, scientific and engineering progress, the education system, the welfare system, city planning, schools, universities, libraries, hospitals...

At this rate, it's all going to disappear.

Think of, for instance, the Library of the Royal Society. It has in its collections innumerable priceless historical documents and objects that are absolutely invaluable, and a testament to how humanity has progressed. Even in the absolute best-case scenario they will just fall into complete neglect, and probably be eventually destroyed by fires or simple decay. In the worst-case scenario it will all be purposefully destroyed by an oppressive ideology that hates western culture.

Think of all the works of art in European museums. All destroyed, by neglect if nothing else, but probably by active vandalism.

Our modern education and welfare system? It will be gone. Schools, universities, libraries, museums... all gone. Our freedoms and liberties? Gone. Our very concept of a free European society? Those basic human rights that you take for granted? All gone.

What happened to Iran in the 70's will happen to us. Yes, Iran was pretty much a free, modern western society in the 70's. Just do a Google image search for "Iran in the 70's" and be surprised. Today Iran is one of the most oppressive and regressive countries in the world.

And the saddest thing is that this will not happen to Europe because foreign forces invaded it. It's happening because Europe hates itself and wants it to happen, and is actively seeking for it to happen. Europe wants to be destroyed as a culture, and erased from history. It wants to be replaced by a regressive totalitarian regime. All this will be gone.

Saturday, June 3, 2017

Revolutions per minute

Do you know which unit of measurement I just hate? Revolutions per minute.

Why is it "per minute"? Why a minute? This is the only unit of measurement I'm aware of that uses one minute as its time unit. Who exactly came up with this?

The problem is that it's so nondescript. Unless you have dealt with this particular unit of measurement a lot, and have a lot of experience with it, it doesn't really tell much. It doesn't give you a mental picture of how fast something is rotating.

If something rotates, let's say, 150 revolutions per minute, how fast is that? How about 2000 RPM? Can you form a mental picture of something spinning at that speed? I can't. One minute is such a long period of time that it's really hard to just visualize it in your head.

Of course you can get revolutions per second, which is much easier to grasp as a mental picture, by dividing by 60. It's just that dividing by 60 is not an easy operation to quickly do in your head.

150 RPM is 2.5 revolutions per second. This is something much easier to mentally visualize. Just imagine in your head a disc rotating 2.5 revolutions in one second, and you'll get a pretty good concept of how fast it's spinning. Likewise 2000 RPM is about 33 revolutions per second. Again, much easier to visualize that it's pretty damn fast. The "2000" figure gave a feeling of fast spinning, but it was much more abstract and harder to concretely visualize in your head.
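
The conversion itself is of course trivial; a throwaway helper like this just saves you the mental division by 60 (and the time per revolution is another easy way to picture the speed):

    # Trivial helper: express a rotation speed as revolutions per second
    # and as milliseconds per revolution.
    def rpm_to_rps(rpm: float) -> float:
        return rpm / 60.0

    for rpm in (150, 2000, 7200):   # 7200 rpm is a typical hard drive speed
        rps = rpm_to_rps(rpm)
        print(f"{rpm:>5} rpm = {rps:6.2f} rev/s = {1000 / rps:6.1f} ms per revolution")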

I just hate it when rotation speed is given in rpm. It just doesn't give me a quick mental picture of how fast it's rotating.

To be fair, there's another unit of measurement that is likewise hard to visualize without an extensive amount of personal experience, namely km/h (or mph, if you are on the other side). Just because you can imagine something traveling eg. 10 kilometers in one hour in no way, in itself, gives you a mental picture of how fast it's traveling locally. Is it walking speed? Running speed? Fast car speed? The only reason why most people are able to grasp the speed is because they have so much experience with those numbers (by driving or traveling by car so much), and that's why they know that 10 km/h is something like jogging speed rather than something wildly different.

The problem with RPM contrasted with km/h is that the vast majority of people have almost zero personal experience with the former, while having ample experience with the latter.

Wednesday, May 24, 2017

Concessions for the limitations of VR

Farpoint is one of the first video games for the PSVR that's more than just a technology demo (not the very first one, but arguably among the first three or four at most). This paragraph in the IGN review of the game grabbed my attention:
Like any good VR game, Farpoint makes concessions for the limitations of virtual reality. For example, in story missions enemies only ever come at you from the front, which lets you play the whole game without having to turn around much. By traditional shooter standards it’s boring design, but in Farpoint it helps stave off the motion sickness some people experience in VR and avoids issues with moving outside the bounds of where the PlayStation Camera can detect you or your controller.
Yeah... concessions for the limitations of virtual reality. By traditional shooter standards it's boring design.

You know, like any "good" VR game. Because that's considered good, apparently.

I'm not saying it's a bad game (I haven't played it), or that the solution isn't reasonable. Or that this is a complete show-stopper. Probably a more minor issue. I'm just putting it out there for the people who still claim that VR will revolutionize the gaming industry, expand what's possible in video games, and that traditional games will become obsolete. Yeah, right.