Thursday, March 30, 2017

In defense of Google and YouTube

I have written in the past a couple of blog posts about how messed up the copyright system on YouTube is (among other things, How to steal people's ad revenue from YouTube, and How YouTube helps big corporations steal your videos).

However, to be fair, and while there are many things that Google could do better regarding those problems, these are measures that Google is pretty much forced to take in order to protect itself, and its users, from copyright lawsuits. Some things could be done better (such as not automatically and irrevocably giving ad revenue to somebody who merely claims ownership of a video), but overall it's something that they just have to do.

There is, however, one thing that Google has been doing, at least so far, that deserves a huge amount of praise. You see, Google appears to be one of the very few social media corporations standing up for the freedom of speech of its users, rather than capitulating to entities, media outlets, and sometimes even governments demanding censorship.

This whole debacle about hundreds of corporations pulling their ads from YouTube, in some kind of semi-collusion or semi-agreement (verbal or implicit, nobody but they know), started in part with Google actually refusing to trample over its users' freedoms and censor videos. It started with Google saying "no" to such demands.

So far Google, very much unlike other social media giants (like Facebook and Twitter), has stood for its users' right to freely express their opinions, whether good or bad, and it deserves a standing ovation for that.

Let us hope that Google retains its integrity and doesn't cave in under the enormous amount of pressure and attacks it's receiving from the media and from virtue-signaling corporations and governments, who are heavily pushing for an extremist leftist political agenda, censorship and the silencing of people with "wrong" opinions.

Let us hope that YouTube remains the largest, and perhaps last, bastion of free speech on the Internet. Freedom of speech is for everybody, not just for the people with the "right" opinions.

Tuesday, March 28, 2017

The Old Media is dying... and they are fighting back

If the election of Donald Trump showed us something, it's that the traditional media is dying. Newspapers, television, radio... it's all dying. They used to have a huge influence over the population, pretty much telling them what to think and how to vote. And they tried, oh man how they tried, to make the population not vote for Trump. And they failed.

It was the final wake-up call. The final nail in the coffin. For years the media has been dying, and this was the conclusive piece of evidence.

But the Old Media isn't going to give up just like that. They are not going to just admit defeat and go away silently. They are the establishment, dammit. They dictate how people must think, and how they must vote. They elect and run governments.

The biggest threat to the Old Media is undoubtedly YouTube. The "problem" with YouTube is that it allows people to communicate with millions of other people directly, without the filtering and biases imposed by the media. There is no establishment between producers and consumers, deciding what is and isn't good.

But how to kill such a giant as YouTube? One wouldn't think it would be that easy. But the establishment is smart. Surprisingly smart: If you want to kill YouTube, stop the incoming money flow. But how?

Well, the slanderous hit piece against PewDiePie, the biggest name on YouTube, was just the first step. It wasn't clear at first why they attacked him in particular. Why is PewDiePie their enemy? Why did they want to take him, in particular, down?

It turns out that it's not about PewDiePie himself. He and his content are irrelevant. They don't give a flying fuck about him. It wasn't really an attack on PewDiePie in particular. It was much more devious than that.

The real reason behind the attack was to show the world, and more particularly the big corporations running ads on YouTube, that their ads are being shown on videos with "antisemitic content", "hate speech", and all the other buzzwords that are so popular in the current political zeitgeist. By attacking the biggest name on YouTube they got the attention of the advertisers. Of course Disney and YouTube pulled their sponsorships from PewDiePie, but that wasn't the actual goal. That was just a means to an end.

The end was to show advertisers at large that their ads are being shown with this kind of "politically incorrect" content.

And it's working so well it's actually scary. The Old Media is telling big corporations to pull their ads from YouTube, and they are complying. Dozens and dozens of megacorporations are doing so, and YouTube is in a crisis.

If it continues like this, YouTube is going down. Exactly what the Old Media, the establishment, wants. That was their goal all along. The Old Media feels threatened by YouTube, and they are attacking it with everything they have. And it seems to be working.

Saturday, March 25, 2017

Orwell's 1984 is becoming more and more true

The most recent terrorist attack in London is showing, once again, how Orwell's vision, and warning, of a totalitarian society is becoming more and more true.

The slogan of "The Party" in his book Nineteen Eighty-Four is: War is peace. Freedom is slavery. Ignorance is strength.

Ignorance is strength. And cowardice is courage.

What is, once again, the cookie-cutter message that the politicians of England, and everywhere else, are giving to the citizens of London? That this terrorist attack will not make them fall into despair, that they will not let fear overcome them, and that they will go on with their lives with courage.

Or, in other words, that the citizens should just ignore the problem and pretend that it doesn't exist. That they should stick their fingers in their ears, cover their eyes, and just keep going on like before, as if nothing had happened. In other words, the coward's way: When there is a problem, just pretend it doesn't exist, and maybe it will go away. And this is, somehow, "courage".

It's actually frightening how much this rhetoric resembles the propaganda of a totalitarian regime set on making the populace submissive and subservient through psychological manipulation: Do not question the establishment. Do not protest. Everything is fine and as it should be. You should ignore all perceived problems, and keep living your life as before. You should keep serving the system.

The subservient attitude is enforced primarily via shaming: If you protest, if you rise up against the establishment, if you go against the government and the system, you are a racist, a bigot, a fascist, an islamophobe, a deplorable person. You wouldn't want to be one of those undesirables, now would you? You wouldn't want to become a persona non grata, would you? Unless you want to be one of them, you should remain docile and quiet, and just keep living your life as before, serving the system, and never raising your voice in protest. You should pretend that nothing is wrong, and ignore the problems.

As more and more people start to fight back, protest, and demand that the problems be solved, and as the shaming tactics become less and less effective, the next step is to criminalize wrongthink. If you protest against the establishment, if you go against the government narrative, you will be put in jail. Expect a visit from the police.

This is not just hypothetical rhetoric. It's happening more and more frequently. Country after country is passing law after law criminalizing dissenting opinion. Canada's passing of the law criminalizing "islamophobia" is but the most recent example. And undoubtedly it's going to become more and more common.

If shaming isn't working, the populace needs to be forced into submission by law. The west will become islamic, by force if necessary. And why? Why are all the governments, and the media, so set on making the west islamic? That's a mystery that might never be answered.

As I see it, there are only two possible outcomes for this situation: Either country after country will become a totalitarian islamic regime, where women are oppressed and homosexuals are thrown from rooftops, or the far right will rise and the west will become Nazi Germany version 2. I see no other possibility. Neither future looks very bright.

Thursday, March 23, 2017

Stop pre-ordering games; it's better for everybody

Back when the internet was a luxury that only very few people had, and buying games online was something that nobody had even dreamed of yet, and thus all video games were sold as physical discs in stores, it sometimes might have made sense to pre-order certain games.

After all, there were only so many physical discs that a video game store could have in stock at one time. If a game was expected to be immensely popular and you really, really wanted to get it on day one, pre-ordering the game made sure that you would indeed get it. While most other people were walking out of the store empty-handed because the game was sold-out, you would simply walk into the store and ask for your copy of it, which you had reserved. Or even have one sent to you by mail. People who didn't pre-order the game would need to wait for days or even weeks before getting their copy.

Not that this scenario was very common, but I suppose that in some cases it could have been.

Since every single game nowadays is sold online (some of them even exclusively online), pre-ordering doesn't make any sense anymore. If you want the game on day one, you'll get it. Just go to Steam, or whatever platform you are using, and buy it. It's that simple. It's not like they will run out of copies to sell.

It would make more sense if there were some kind of good incentive to pre-order. A great incentive would be getting the game cheaper. For example, if the normal price for the game is 59.90€ but pre-ordering gets it for you at 49.90€, a whole 10€ saving is a good deal for a brand new game.

But no. That is almost never the case. I don't remember ever seeing pre-orders being cheaper (there might have been some cases, but they are really, really rare.) You might get some completely inconsequential in-game stuff with a pre-order, but that's it. Most often what you get is completely useless and not really worth it. (In a few cases it might, quite ironically, even be detrimental to your playing experience. Such as getting a load of in-game currency, or some high-powered weapons from the start, which in some games would lower the challenge of the game.)

There are many reasons not to pre-order.

One is, of course, that you are blindly buying a product without knowing whether it's good or not. Too many times have games been hyped to the heavens, and tons of people have pre-ordered them, and then the game spectacularly failed to meet those expectations, and was considered if not an outright commercial disaster, at least bland and average, and in no way worthy of the hype. In other words, something that most people might be willing to buy if heavily discounted, but not at full price.

The other reasons are subtler.

Pre-orders entice game developers to rush the publication of their games. (Or, more precisely, they don't give them much incentive to polish their games before publication.) More and more games are published unpolished, full of defects and bugs.

In most cases these defects and bugs are fixed with subsequent patches, which is great and all, but the problem is that you, as a first-day purchaser, are experiencing the first buggy version of the game. Things might not look like they should, the game might not function properly, and there may even be crashes and other problems. You are experiencing a flawed version of the game. People who buy the game weeks or months later will be experiencing a better version of it.

Do you really want to experience a buggy version of the game, while others are experiencing a much better version of it?

If people stopped pre-ordering games, and stopped rushing to buy them as soon as possible, then perhaps game developers would take more care to make their product as flawless as possible, so as not to get crushing reviews from critics, which would discourage people from buying.

Monday, March 20, 2017

Glamour Magazine and the myth of the pay gap

There's this really hilarious video made by Glamour Magazine called "Confronting the Pay Gap: Two Sales Executives Compare Salaries." It's about a white man and a black woman in similar jobs talking about the "pay gap", and how women, especially black women, are disadvantaged and paid less than white men.

Firstly, just look at them:


The woman is laid back, comfortable, and confident. Her posture, demeanor and manner of speaking are those of a person who is in control, and knows it. Leaning back, open arms, relaxed posture. She is confident; she is in control.

The man's posture, however, speaks volumes. He looks extremely meek and timid. He looks like a person who has been beaten into submission, if not physically, at least psychologically. Hands closed, legs closed, crouching posture. He's not leaning back, but instead looks like he's subconsciously cowering from a threat. He looks uncomfortable, timid and almost fearful. He is subconsciously closing himself off and protecting himself as much as possible, making himself as small as possible so as not to appear threatening or cocky. He exudes the exact opposite of confidence. He almost looks like a slave who has been beaten into submission by his master. And his manner of speaking doesn't dispel this picture. He is pretty much the polar opposite of the woman.

And no, I didn't pick a particular moment where he just happened to be in that posture, eg. because he was shifting positions or something. He really sits like that for the entirety of the video.

While there are people, including men, who are this meek innately, and have been so their entire lives, it's also possible that this particular man has been made so by his environment (perhaps by an overbearing mother, by his surrounding society, by his school, or by feminists, or any combination of them.) He seems to be the embodiment of what's commonly and derisively called a "beta male". He doesn't seem to have an inkling of masculinity in him, for good or bad. It really looks like he has been beaten into submission, at least psychologically.

Anyway, the conversation goes on about how white men are privileged and how they get better salaries, and how there's a pay gap between men and women, especially black women. Then comes the big reveal: They write their yearly salaries on pieces of cardboard, and reveal them at the same time. This is supposed to be really telling. But what happens?


She actually makes $20 thousand a year more than him. For pretty much the same job.

So is this a big wake-up call for both of them? Do they realize that maybe, maybe, it's not that simple? That it's not always so that white men are paid more than black women for the same job?

No, of course not. This is not a video about dismantling the notion. No. They actually keep talking as if the man is still more privileged than her, somehow. Even though he makes $20 thousand less than her per year. For pretty much the same job. They still keep talking as if she were paid less than him, even though they just revealed the exact opposite.

Yes, as incredible as it sounds. It's some kind of bizarro universe, where big is small, and small is big, and nothing makes sense.

I actually feel pity for the poor guy. He clearly has some psychological issues, he has clearly been beaten into submission, and he's even being paid less for the same job than other people, yet he has still been brainwashed into believing that it's him who is the "privileged" one. It's like pouring salt into his wounds, into his already troubled life, and he just accepts it. This is Stockholm Syndrome level of insanity.

Saturday, March 18, 2017

A new form of silencing negative video game reviews

There have been a few absolutely infamous cases of astonishingly shoddy game developers trying to silence negative criticism of their video games by trying to abuse copyright laws. In other words, the developer tries to remove all negative criticism from the internet by claiming copyright on the material and, of course, abusing DMCA takedowns.

In some cases it has gone to astonishing lengths, such as the case of Digital Homicide pursuing a year-long lawsuit against the video game critic Jim Sterling, for the sole reason that he made negative review videos about their games. The whole process was really long and really ugly, with Digital Homicide piling up claim after claim, and after those were all dismissed by the court, the main owner of the company transforming it into a personal slander lawsuit, again piling up claim after claim. After about a year he just gave up, and the lawsuit obviously didn't achieve anything.

There's also a very similar copyright/slander/whatever lawsuit brought by another company against the YouTube channel h3h3 Productions, and it's equally ugly, nasty, and nonsensical. And these are by far not the only two cases, just the most prominent ones (there have been many, many others, but they have often ended much sooner. All of them in favor of the accused, ie. the video game critic, of course.)

The crucial thing is, of course, that copyright law fully protects critics. Critics have the fundamental right to criticize, and using footage from the work being criticized is explicitly considered fair use.

That doesn't stop some people from trying anyway. Especially on YouTube, the DMCA is astonishingly easy to exploit. The only thing that the IP owner has to do is fill out some online form, and as if by magic the video is taken down.

The thing about YouTube DMCA, however, is that it has a somewhat fair countermeasure (required by law): The person who made the video can make a counter-claim, and have his video automatically restored. In this case the only recourse that the IP owners have is to take the case to court, if they think their DMCA claim was legit. In 99% of cases they don't.

However, a game company has now discovered a new tactic for taking negative reviews of their games down from YouTube, one which bypasses the DMCA mechanism (and, curiously, they used it against Jim Sterling; it seems like he just can't catch a break): Rather than use copyright, they use trademarks instead.

YouTube has no policies with regard to trademarks. They are not required by law, or by any statute, to react to trademark claims, and they have no mechanism for dealing with such claims. YouTube could just as well ignore any "trademark claims" they receive, with no repercussions, because no law obligates them to react.

However, it seems that YouTube's policy is to react to companies sending them takedown requests based on trademarks. A company sends them a message that "this-and-this video breaks our trademarks", and YouTube will take the video down. Apparently manually. There is no automated system for this.

Likewise there is no countermeasure that the creator of the video can take. There is no system to make any sort of "counter-claim", like with DMCA. The video is down for good. Complaining to YouTube isn't going to help; they won't do anything.

The only recourse that the creator of the video has is to take the company to court, win the case, and then send YouTube the court's decision and hope that YouTube restores the video. And if they don't, the only recourse remaining is to ask the court to demand YouTube to restore the video.

99.99% of people, heck, maybe 100% of people, aren't going to do that. They aren't going to take a corporation to court over a single video, because of a spurious trademark claim, hope they will win, and then on top of that possibly try to issue a legal restoration demand to YouTube.

So it's a perfect scheme, really. It seems that where copyright claims fail, trademark claims are the absolutely perfect weapon to take down negative YouTube video reviews for good. Even though YouTube has no legal obligation to obey such trademark claims, they do. And there's pretty much nothing that the video creator can do about it, other than go through a hopeless and endless legal battle, just to have one video restored.

Wednesday, March 15, 2017

What does it mean for a PC/console to be "4k"?

As I have written previously, there is a lot of confusion out there about whether the PlayStation 4 Pro supports running games at native 4k resolution (ie. 3840x2160 pixels), or whether it always renders games at a lower resolution and upscales for the 4k display. I think that much of the blame for this confusion can be put on Sony, who haven't made it clear enough in their promotional material.

The answer is, of course, that yes, the PS4 Pro can render games at native 4k, without upscaling. This, however, is a choice made by the games (or, more precisely, the game developers) themselves. Games can choose the resolution at which they will be rendered on the PS4 Pro, and 3840x2160 is a perfectly valid option, and there are already several games with this support.

The confusion probably stems from the widely advertised feature of the PS4 Pro that it will render existing games (ie. those made for the base PS4 model before the Pro was released, and which have not been patched with higher resolution support) at a lower resolution, and upscale the result for the 4k display. This has given many people the wrong impression that it will do the same with all games, even new ones, no matter what.

The misconception is, somehow, really prevalent, and goes all the way up to people who really should know better, such as video game reviewers and gaming company representatives. Also, some PC gamers who are eager to deride and attack consoles and their users fully embrace the misconception in order to make fun of the PS4 Pro and claim that it's not "really 4k". The lengths they will often go to in order to make that claim are actually quite egregious.

In an online conversation I corrected someone's direct claim that the console can't run games at native 4k. Regardless of my correction, he insisted for several messages that I was the one who's wrong, and that the PS4 Pro will always render games at a lower resolution and upscale. It took several posts with links to articles and pictures to convince him that it does support native 4k without upscaling.

But, of course, rather than admitting his error and his misconception, he then started moving the goalposts, and coming up with other requirements for a system to be "true 4k".

The first one that everybody always presents is, of course, that it can't run those games at 60 frames per second when in 4k resolution. As if that were somehow a requirement for something to be "true 4k". I have never heard of such a requirement anywhere. It's a completely made-up one.

But even that claim is false. There's nothing stopping the console from running a game at native 4k resolution at 60 FPS. The choice is made by the game developers: Do they prefer framerate, or graphics? The console, or Sony, is hardly to blame if game developers go for graphics rather than framerate. And, as of writing this, there is at least one game that does run at 4k@60FPS: FIFA 17. So it's not like it's impossible.

Of course now the goalposts are moved once again and, apparently, if it can't run all games at native 4k and 60 FPS, then it's not "true 4k". Apparently, if there exists even one single game that it can't run like that, then it's not "true 4k". The requirements just keep piling up. Of course this requirement pretty much means that no platform exists (and probably no platform will ever exist) that's "true 4k" in this sense. From the millions of games out there I'm certain you can always find at least one that the system can't run at native 4k at 60 FPS, no matter how beefy your computer may be.

It all comes down to the definition of "a 4k system". Some PC gamers are eager to pile up requirement after requirement just to make the PS4 Pro not "true 4k" (while at the same time making pretty much all PCs, no matter how powerful, likewise not "true 4k"), but that's not really a reasonable way of defining the concept.

The simplest and most reasonable definition of a 4k system is if it has support for 4k displays (ie. 3840x2160 pixel displays), and can show content using that resolution.

After all, this hasn't always been the case even on the PC side. 4k support is actually surprisingly recent. For instance, the first Nvidia graphics cards that had support for 4k resolutions were the GTX 600 series, released in 2012. In fact, pretty much no computer (PC or Mac) nor console had 4k support prior to 2012 (except, possibly, some early prototypes).

If a computer (or console) supports a 4k display, and can show content using that resolution (eg. video material, or anything else), it pretty much means that it has 4k support, plain and simple. Frame rates have absolutely nothing to do with this. Graphical quality has absolutely nothing to do with this. There is no standard by which 60 FPS is somehow a requirement for something to be "true 4k". That's completely arbitrary. (And even then, as said, the PS4 Pro can run a game at that frame rate. It's up to the game developers whether they want it or not.)

On the PC side there's this whole "is your PC 4k ready?" thing, which is completely and absolutely arbitrary. The minimum requirements for "4k readiness" are a complete ass-pull. For example, the benchmark score thresholds for a "4k PC" in benchmarking software are completely arbitrary and constantly changing. It all comes down to somebody's opinion. There is no standard for this.

The 3840x2160 resolution is a completely unambiguous absolute form of measurement. Can your system output to a display of that resolution, showing content with that many pixels? If yes, then it has 4k support. Easy, simple, unambiguous.
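Purely as an illustration of that definition, here is a minimal Python sketch (assuming a desktop environment where the standard tkinter module can query the screen) that checks whether the current display meets the 3840x2160 threshold. Note that, exactly as argued above, it says nothing about frame rates or graphical settings.

# Minimal sketch: check whether the current screen meets the 3840x2160 threshold.
# Assumes a desktop environment where tkinter can query the screen; this only
# inspects the display resolution, not frame rates or settings.
import tkinter

UHD_WIDTH, UHD_HEIGHT = 3840, 2160   # "4k" in the sense used above

def is_4k_display() -> bool:
    root = tkinter.Tk()
    root.withdraw()                   # no window needed; we only query the screen
    width, height = root.winfo_screenwidth(), root.winfo_screenheight()
    root.destroy()
    return width >= UHD_WIDTH and height >= UHD_HEIGHT

if __name__ == "__main__":
    print("4k-capable display detected" if is_4k_display() else "Below 4k")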

Friday, March 10, 2017

Do not consider YouTube permanent storage for your videos

Many people seem to think that YouTube is a good free service to permanently store their videos. Small PSA: Don't!

YouTube has no backup system for your videos. If you, for instance, delete one of your videos from YouTube, then it's permanently gone. If you don't have a copy of it somewhere else, you will have lost it forever.

Why is this important? There have been several cases of people having their YouTube account hacked, and all of their videos deleted. Hundreds of them. *Poof* Gone forever. Permanently out of existence. No possibility of retrieval.

It is understandable that not everyone has enormous amounts of disk space, especially if they are prolific with their video creation (or make really long videos). But do not consider YouTube a safe storage space for your videos, especially if they are important, or your livelihood depends on them. (After all, many people get income from YouTube ad revenue. Some people even make a living from it.) They could all be gone at any moment, in the blink of an eye.
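As a purely illustrative sketch of that advice: assuming you still have your original video files on disk, something as simple as the following Python script (with hypothetical paths) mirrors them to a second drive and verifies the copies with checksums.

# Minimal sketch: mirror local video masters to a backup drive and verify them
# with checksums. The paths are hypothetical; adjust them to your own setup.
import hashlib
import shutil
from pathlib import Path

SOURCE = Path("~/videos/masters").expanduser()   # your original files
BACKUP = Path("/mnt/backup_drive/videos")        # eg. an external drive

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_videos() -> None:
    BACKUP.mkdir(parents=True, exist_ok=True)
    for video in SOURCE.glob("*.mp4"):
        target = BACKUP / video.name
        if not target.exists() or sha256(target) != sha256(video):
            shutil.copy2(video, target)          # copy (or re-copy) with metadata
            print(f"backed up {video.name}")

if __name__ == "__main__":
    backup_videos()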

Why "workers' parties" seldom drive the interests of the working class

In many, perhaps most, western democracies there is almost invariably at least one, and often several, "workers'" political parties. Quite often at least one of them will have "workers" or "labor" in its official name to indicate this. Yet, in the modern world, it seems very rare that such parties actually promote the interests of the working class, whom they supposedly represent. Too many times they pass laws and make decisions that harm the working class and are protested by it.

But why? I have a theory.

Workers' parties usually start from within the working class, to represent its interests in parliament. They quite often tend to become very big parties because the working class is invariably by far the largest class in society, and its members tend to vote for the party that promotes their interests the most. The representatives themselves come from that same class.

However, once the party becomes big enough, it usually means that the politicians and the parliamentarians start getting hefty salaries. They often stop being part of the working class, and become upper middle-class. Some of them even become outright rich upper class (ie. "aristocrats").

Likewise as time passes and new politicians join the party, these new people might themselves not be working class at all, but middle-class already to begin with. Or even richer.

Even those who started as part of the poorer working class, now with hefty incomes, will move to the richer neighborhoods, where all the upper class is living.

Which, of course, means that these politicians are not surrounded by the everyday lives of the working class anymore. They are completely detached from the very people they are supposed to represent. Instead, they will be surrounded by rich people, the upper class, whose lives and needs are very, very different from that of the working class.

Don't get me wrong. In my view rich people can be rich. I have honestly and literally zero jealousy against them. I have absolutely nothing against them. Good for them, especially if they have earned it with their work and talent (but even if they have just inherited their riches, so be it; it's all ok in my books.)

What I am, however, trying to point out is that rich people, living in their rich neighborhoods, do not experience nor understand the struggles of the working class. The working class could just as well live on the other side of the planet, effectively, from their point of view.

And now, the politicians who are supposed to represent the working class in the government, are detached from that very class, and do not see nor experience what they do. It's no wonder that they become detached, and start acting against the interest of people they don't know nor understand. They only see the rich people around them, so it's logical that their thinking and attitude will be shaped accordingly.

If a certain ideology becomes popular among the rich people (such as virtue-signaling with immigration-friendly attitudes), the politicians will also likely share that ideology. Even if the working class doesn't. Thus the party that's supposed to represent them will act against their opinion.

A generation of sheltered spoiled brats

How do we learn social skills, and how to act with other people in social interactions? How do we learn to act in a manner that makes us socially acceptable, and even pleasant to interact with?

In part, by being taught by our parents. However, that's only part of it. Perhaps the smallest part. The most important way is by learning from our mistakes: not knowing, because of lack of experience, how to act in a given situation, acting badly, ie. making a mistake in our behavior, and either making fools of ourselves or sometimes even hurting someone, and then realizing the consequences of it, and learning from it.

An important part of this process is other people telling us of our mistake. Criticizing us. Pointing out where we went wrong. When this happens, especially if it's a really bad mistake, it can be very shameful and embarrassing, and it can even hurt. But that's the process of learning. We learn from our mistakes, and know how to do better next time.

However, what happens when somebody grows up in an extremely sheltered environment, isolated from the larger population, almost never having any social interactions outside of a minuscule inner circle, never having any of his or her actions questioned or criticized, and learning that they will always get whatever they want? What happens when they never get their feelings hurt by being reprimanded and having their mistakes pointed out and criticized?

A spoiled brat, that's what.

Spoiled brats, if they grow to adulthood thus sheltered, often will have a very hard time in life. They will be socially awkward and, quite often, will be extremely egotistical, demanding, obnoxious and, in general, be very unpleasant people who nobody likes to be friends with, because they have lived their entire lives without learning proper social skills, and having everything they wanted. When suddenly they get to the real world and people don't actually do everything they want, nor give them everything they want, they will get upset and throw tantrums. And adult tantrums can be really destructive.

Oftentimes such spoiled brats will only suffer themselves because of their lack of social skills. However, if they are more powerful (eg. because of having been born in a very rich and powerful family), they can make the lives of other people a living hell.

But what happens if an entire generation of people grows up as spoiled brats? What happens if the majority of people have lived sheltered lives where they have never been criticized nor reprimanded for their mistakes, and have learned to get everything they want?

This is not just hypothetical. It appears that current society, at large, is trying to make the next generation exactly like that.

Generations to come are internet generations, for good or bad (I'd say mostly for bad). They will be people for whom the majority of social interactions happen through the internet, rather than in real life.

A free and open internet can be a harsh place. It's an open marketplace of ideas, where you can find all sorts of opinions and viewpoints, all kinds of political stances, and all kinds of claims that can be true or false.

Openness is in itself a relatively good thing overall. People should be free to express any opinion they like, and have the opportunity to hear other people's opinions.

However, current society at large is trying its hardest to change that. This... what could we call it... quasi-organic entity composed of megacorporations, governments, politicians, international organizations, activist groups, and so on and so forth, is trying its hardest to make the internet essentially an echo-chamber for each group of people with different ideas.

"Wrong" political opinions are being silently censored and shadow-banned (which means that the poster of the opinion doesn't see that his opinion has been hidden from everybody else). This is happening on social media platforms like Twitter and Facebook, and increasingly everywhere else.

Moreover, the shadow-banning technique is incrementally being taken further and further. Rather than outright censoring people (which seldom works), what they are starting to do is isolate people with differing opinions into their own inner groups, often without them even noticing. In other words, they do not actually shadow-ban people's opinions entirely; they shadow-ban them only from people who hold different opinions, while still showing them to like-minded people. This is a significantly more devious tactic because now people will be getting feedback on their posts, giving them the false impression that their post is open for everybody to read, when in fact it's only being seen by other like-minded people.

The technology to do this automatically is still in its infancy, but it's already being developed and used in some social media platforms (including many that are owned by Google Inc.) They are developing learning AI algorithms that will detect and categorize the kind of opinions that people express in their posts, and thus categorize users into groups based on which kind of posts they make and follow. There are already social media platforms that will "redirect" posts that are "off-topic" (which in practice means posts with dissenting opinions) to their "proper" groups. Or, in other words, isolating people into their own echo-chambers based on their opinions.
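To make the described mechanism concrete, here is a purely hypothetical Python sketch of that kind of selective-visibility filtering. None of this reflects any platform's actual code; the user names and opinion "clusters" are invented for illustration.

# Purely hypothetical illustration of the selective-visibility idea described
# above: a post stays visible to its author and to like-minded users, so the
# author still gets engagement, but users in other opinion clusters never see it.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    opinion_cluster: str   # eg. assigned by some classifier over past posts

@dataclass
class Post:
    author: User
    text: str

def visible_to(post: Post, viewer: User) -> bool:
    if viewer is post.author:
        return True        # the author always sees their own post
    return viewer.opinion_cluster == post.author.opinion_cluster

# Example: the post only reaches the like-minded viewer.
alice = User("alice", "cluster_a")
bob = User("bob", "cluster_a")
carol = User("carol", "cluster_b")
post = Post(alice, "some contentious opinion")
print([u.name for u in (alice, bob, carol) if visible_to(post, u)])   # ['alice', 'bob']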

As said, this is much more devious than simply removing such posts, or even shadow-banning them.

Governments are, of course, all for this kind of thing. More and more governments are pushing for censorship of "fake news" and "hate speech". Of course "hate speech" is a broad concept and can be, and is, used as an excuse to vilify simply dissenting political opinions and criticism. It doesn't consist solely of eg. actually racist agitation, but also things like criticism of immigration policies, or any sort of political opinion that, in our modern world, is classified as "right-wing".

If these megacorporations and governments succeed in making the internet consist of isolated echo-chambers, we will increasingly see a generation of spoiled brats who have never encountered a differing opinion, have never encountered criticism, have never been exposed to ideas that differ from their own, and have instead had the same narrow ideas reinforced their whole lives.

What happens when these sheltered spoiled brats become politicians, parliamentarian representatives, heads of state, judges, the police, CEOs, and so on? What happens when an entire generation of people will throw temper tantrums when other people don't agree with them, nor immediately give them what they want? What happens when these spoiled brats are people in power, people with actual power to seriously hurt the lives of other people?

Thursday, March 9, 2017

Can a PC offer console performance at the same price?

There are plenty of YouTube videos (and of course web pages) out there where people engage in the task of building a PC at the same price as a console (currently the PS4 Pro being of course the most popular comparison) to see if the PC offers the same performance, or even better. Almost invariably, it does.

However, these projects are often deceptive because they are ignoring hidden costs.

Almost invariably they will have a budget of approximately the price of the console... and will spend the entire budget on the base PC hardware and that's it. In other words, the motherboard, PSU, CPU, GPU, RAM, a hard drive, and the PC case. Then they test its performance at playing games, and compare it to the same games on the console.

However, this is deceptive. They are actually spending more money on the hardware than they should, and are ignoring the hidden costs that you would need to pay if you actually wanted to build such a PC to play games. They are also not actually offering all the same capabilities as the console.

For starters, the console always comes with a controller. I have yet to see the price of a controller added to the PC in any of these comparison videos. If they wanted to make the comparison fair, they would actually need to use the actual controller of the console (ie. PS4 or Xbox One). Incidentally, both controllers are usable on the PC, and can be used to play games. And they're not exactly cheap. If you want the same controller, that adds at least 50€ to the overall price. That's quite a big chunk of the overall budget (when we are talking about a total budget of about 400€ or whatever).

However, more importantly, I have yet to see the price of Windows being taken into account in the budget.

Sure, sure, you could theoretically just run Linux on your budget PC, and be able to play some games with it. However, let's face it, if you want a real gaming PC, you just need Windows on it. Not only does Linux support only a fraction of PC games, it unfortunately also runs them less efficiently (most probably because the GPU manufacturers can't be bothered to create optimized graphics drivers for Linux.) Windows is simply a practical necessity for a real gaming PC, and there's no way around it. You have to add it to the cost if you are being honest.

And Windows isn't exactly cheap. Here, for example, Windows 10 Home costs about 135€. Even the OEM version isn't much cheaper (only 10€ or so less.) It's really expensive. If you are doing this kind of comparison project, and you are being honest, you really have to allocate that sum in your budget.

And suddenly you find out that almost 200€ of your budget goes to the OS and the controller. That's like half of the entire budget right there. You now only have a bit over 200€ for the actual hardware. Suddenly the comparison isn't looking all that good after all.
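Spelled out as rough arithmetic in a small Python snippet (using the post's own ballpark figures, which are estimates rather than exact quotes):

# Rough budget arithmetic: how much is left for actual PC hardware once the
# "hidden" costs are counted. All figures are ballpark estimates in euros.
console_price = 400.0      # approximate price of the console
controller = 50.0          # the same controller the console ships with
windows_license = 135.0    # Windows 10 Home, roughly

hidden_costs = controller + windows_license
hardware_budget = console_price - hidden_costs

print(f"Hidden costs: {hidden_costs:.0f} EUR")          # ~185
print(f"Left for hardware: {hardware_budget:.0f} EUR")  # ~215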

So actually no, I'm not convinced that you can actually buy a PC with comparable specs that has the same performance as the console, at the same price. Not if you include everything that you would need in order to actually play games, and using the same peripherals.

Is Intel engaging in planned obsolescence?

I have an i5-2500K CPU in my PC. This particular CPU is rather famous among PC enthusiasts for being amazingly overclockable. Its official base clock rate is 3.3 GHz, with an official maximum boost frequency of 3.7 GHz (boosting being a feature of modern Intel processors where, if enabled, the CPU will automatically and dynamically "overclock" itself depending on the current load.)

However, with proper cooling, people report the CPU being completely safe to be overclocked even up to 4.6 GHz without problems, which is rather astonishing.

I have an efficient CPU heatsink by CoolerMaster, which keeps the CPU incredibly cool even under full load. With this heatsink, without overclocking, the CPU stays at about 35°C or lower when idle. At full load (all four cores at maximum load) the temperature stays under 55°C, which is quite remarkable.

I have overclocked the CPU to 4.2 GHz, and under full load the temperature is about 63°C, which is still well within completely safe limits.

A friend of mine has a Xeon E3-1281 v3, and he uses the exact same heatsink as I do. His CPU is not overclocked, yet it reaches temperatures of even 80°C when under full load. We have been pretty much unable to determine what the problem may be. However, I have a theory about this.

The main reason why CPUs have a limit on clock speed is temperature. (There are, of course, other factors at play, but temperature is by far the major one.) When the CPU is overclocked, its temperature under load inevitably increases, and it can only take so much before it breaks (although basically all modern processors throttle down before that, to avoid breaking.)

One of the reasons cited on why the i5-2500K is so famously overclockable is that it has a really well-crafted heat transmission between the CPU die itself, and the heat spreader lid that's on the CPU chip (which is what then makes contact with the heatsink; or, in this case, the heatpipes of the heatsink.) This allows the heat from the CPU die to be efficiently transmitted to the heatsink, keeping the CPU relatively cool even under heavy loads and massive overclocking. With an efficient heavy-duty heatsink this means that astonishing overclocking (such as from 3.3 GHz to 4.6 GHz) becomes possible.

Many people have noticed, however, that more recent Intel processors not only run hotter, but also cannot be overclocked nearly as much, no matter how efficient the heatsink may be. Even if you have a really, really massive heatsink made of the highest quality materials, it doesn't help much. Moreover, these CPUs seem to get worse and worse in this regard in just a couple of years, as if their heat dissipation capabilities degraded over time.

And the reason for this becomes clearer when the CPU heat spreader lid is removed: For some reason Intel has started using really cheap thermal paste under the lid (ie. between the CPU die and the lid). There are videos on YouTube of people doing this to a modern Intel CPU that was running really hot, and discovering that the thermal paste inside has pretty much solidified and lost most of its heat transmission properties. This is something that happens with really cheap, low-quality thermal paste. When the junk paste is removed and replaced with fresh, high-quality thermal paste, the temperature goes way down.

Why would Intel suddenly start doing this? Even the highest quality thermal pastes out there are relatively cheap. We are talking about something like less than a US dollar per CPU. These are CPUs that cost from 200 to over 500 US dollars. One meager dollar more isn't going to change that by much.

My theory (and I'm not the only one) is that this is planned obsolescence by Intel. They are deliberately putting low-quality thermal paste in their CPUs so that they will become unusable in a few years, forcing the majority of people to then buy a new one.

The i5-2500K was, possibly, one of the last CPUs that was explicitly designed by Intel to be highly overclockable and durable, thanks to a fantastic high-quality heat transmission between the CPU die and the heat spreader lid. After that it seems that they changed policies and went for a really, really underhanded anti-consumer tactic of planned obsolescence.

Wednesday, March 8, 2017

Square Enix hates turn-based combat

Strict turn-based combat has traditionally been one of the staples of JRPGs. This means that combat consists, effectively, of the player and the enemy/enemies taking turns attacking each other (or doing other actions). During your turn, the action effectively pauses, and you have all the time in the world to select an action (most traditionally from a menu, and increasingly by other means the more modern the game).
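For readers who have never played one of these games, here's what such a strictly turn-based, menu-driven loop boils down to, as a deliberately minimal Python sketch (the stats, menu entries and names are invented for illustration, not taken from any particular game):

# Minimal sketch of a strictly turn-based, menu-driven battle loop.
# The player and the enemy alternate turns, and the game waits indefinitely
# while the player picks an action from a menu.
player = {"name": "Hero", "hp": 50, "attack": 8}
enemy = {"name": "Slime", "hp": 30, "attack": 5}

MENU = {"1": "Attack", "2": "Defend"}

def player_turn():
    print("\n".join(f"{key}: {action}" for key, action in MENU.items()))
    choice = ""
    while choice not in MENU:          # the game waits; time is effectively paused
        choice = input("Choose an action: ").strip()
    if MENU[choice] == "Attack":
        enemy["hp"] -= player["attack"]
    else:
        player["defending"] = True

def enemy_turn():
    halved = player.pop("defending", False)
    player["hp"] -= enemy["attack"] // (2 if halved else 1)

while player["hp"] > 0 and enemy["hp"] > 0:   # strict alternation of turns
    player_turn()
    if enemy["hp"] > 0:
        enemy_turn()

print("Victory!" if player["hp"] > 0 else "Game over.")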

Some games and game franchises still use this traditional form of combat, one of the most quintessential examples being the main Pokémon games. However, it has become rarer and rarer with time, especially from the 2000's onward. Game companies that have traditionally made, and still make, JRPGs have tried all kinds of other combat systems, almost invariably bringing them closer and closer to real-time combat with actions bound to controller button presses, rather than being selected from some kind of menu. Combat has also become increasingly less turn-based and more real-time, in that it does not pause at any point, and the combatants don't take turns.

Square Enix seems to be one of these companies. Both Square and Enix (when they still were separate independent companies) made some of the most famous and most popular JRPGs of all time, starting in the late 80's, and going very strong for the entirety of the 90's, and well into the 2000's. Basically all of them used strict turn-based and menu-based combat, with very few exceptions.

At some point, however, Square started moving away from turn-based combat systems. I think Final Fantasy IX was their last game to use that system, and Final Fantasy X, while still using a menu system, started moving away from the traditional system and more towards real-time combat.

While they were not exactly the first to ever do this, Square Enix JRPGs nevertheless started emphasizing the real-time aspect of combat, and de-emphasizing the strictly turn-based system of old. While you still select actions from a menu, the combat doesn't actually pause, and you have to be expedient about it (although some games may have an option to pause the combat when it's your "turn".)

Many spin-off games, such as Crisis Core, got rid of the menu system completely.

The same is true for Square Enix's latest game in the main franchise, Final Fantasy XV: The combat system is completely real-time, controller-button bound, with no turns and no menus of any kind. The combat system is more akin to a so-called "spectacle fighter" than a traditional JRPG.

A couple of years ago they announced that they would be creating a complete remake of Final Fantasy VII, one of the most successful and popular games in the series. The original game used strictly turn-based and menu-driven combat. Apparently the remake will have none of that, and will be purely real-time spectacle-fighting, like FF XV. This is quite a disappointment to many.

Many people, including me, like the old turn-based combat system of so many JRPGs of the 90's. We miss it. It's hardly used in any game anymore (Pokémon being perhaps the only exception).

I find the real-time spectacle fighting system boring. It's just meh. There's something charming and relaxing about turn-based, menu-driven combat. But it seems that Square Enix has decided never to make another game like that. Almost no game company is making such games anymore (with the exception of indie developers creating traditional JRPGs with RPG Maker.)

"Women's Day" shouldn't be a thing

Nor should "Men's Day", or any kind of "Day" celebration dedicated to a group of people based on their gender, race, sexual orientation, or any such inconsequential characteristics.

Consider this description that, for example, Facebook gives of Women's Day: "Let's celebrate the amazing contributions women make to our world and our future."

"Women" is taken as a homogeneous group, and contributions made by some of them are attributed to all of them. Like they were some kind of hive mind that acts in unison.

This is such a collectivist way of thinking. As a devout individualist, I find that this kind of thinking clashes badly with my principles.

As an individualist, I strongly endorse the rights of the individual, as well as treating and judging people based on their personal merits, qualifications, achievements, actions, personality and content of character, regardless of what their gender, race or sexual orientation may be.

If an individual person has made a significant contribution to society and humanity, that may well be worthy of appreciation and respect. However, taking a physical inconsequential characteristic of said person, and using that to give credit for that contribution to all people who happen to share that same characteristic, is abhorrent.

If there is a woman who has made a great contribution to society and humanity, I'm all for recognizing and celebrating that. But not because she is a woman; rather, because she is a person who has made such a contribution. It doesn't really matter whether she is a woman, a man, or anything else. What matters is what she has done, not what she is.

Special treatment (usually special services) should only be granted to groups of people if there is a scientifically provable medical or biological reason for it, for practical and pragmatic reasons, but that's about it. Other than physical necessity, people should always be treated as individuals, based on their personal merits, not based on what they are.

Monday, March 6, 2017

Why trademarks are a good thing

A trademark is a legal protection that a person or corporation can acquire for a certain distinctive brand name as well as distinctive brand image characteristics (such as certain styles, fonts, shapes, colors, etc.) when used in a certain context. The purpose of trademarks is to protect the owner from counterfeit products that are deliberately made to look the same, or very similar, for the intent of being mistaken for the original product.

A product name, when used in a certain context, can be trademarked. (For example, the name of a detergent can be protected in this way when used on detergent containers. Or the name of a beverage, when used on soda cans.) Moreover, certain styles, shapes, fonts, etc. can be trademarked ("trade dress") when used in such a context. For example, not only may the name itself of a beverage be trademarked, but if it's printed in a very distinctive way, like using a certain font and coloring, that can be trademarked as well (so that similarly named products cannot use the same style to confuse consumers and make them mistake it for the original.)

Many people think that this is just corporations being overly protective and greedy about their property, opportunistically suing people over even the slightest random resemblance to one of their products. However, trademarks actually benefit consumers too, and this is something that most people don't realize.

When corporations strongly protect their trademarks, quickly and efficiently removing all counterfeits and knock-off products from the market, this creates an actually desirable situation for the consumers. This is because when you buy a product, you can be sure of who made it. You are not buying a product blindly from an unknown source, but you can be assured that it has been made by a proper corporation obeying the safety laws and regulations of your country, and who can be held properly responsible if there's something wrong with the product.

Suppose you are buying a beverage, for instance. Wouldn't you prefer that it was made by a known company that obeys the safety regulations of your country, and that will be held responsible if something goes wrong? Rather than, you know, buying a beverage manufactured by who knows what entity, who knows where, under who knows what circumstances.

When you see a familiar logo or name on the beverage can, you can be certain that it's almost certainly safe to consume. That's because the company that manufactures it is so strict about their trademarks and fights off any knockoff counterfeit products made who-knows-where that might attempt entering the market.

It's not always about health safety either. If you are buying, for instance, some kind of device, you can be certain that it has a warranty that will protect you if the product is defective (at least in most countries). If, however, you were duped into buying a counterfeit version of the product, and it turns out to be defective (or even dangerous), well, you simply lost your money. There is no warranty.

That's why trademarks, and corporations fiercely protecting them, are a good thing, even for the consumers.

Wednesday, March 1, 2017

The media's job is to control what people think

This is a rather marvelous, and hilarious, excerpt from an MSNBC newscast. It ends with the news anchor saying:
"... (Trump) could have undermined the messaging so much that he can actually control exactly what people think, and that is our job."
Sure, sure, we can give her the benefit of the doubt and assume she didn't actually intend to say it like that. However, it does work beautifully as a Freudian slip.

Console games do not support mouse&keyboard

Game consoles can be quite convenient for many people, as they are much less hassle to use and to play games on than a PC. They are pretty much "plug&play". They also tend to be cheaper than a PC with comparable capabilities and accessories. Oftentimes they also have pretty good exclusive games, which isn't something to dismiss lightly. For these and many other reasons consoles are not just for casual gamers, but also for much more "hard-core" gamers (even those who are also avid PC players).

There is one thing, however, that annoys me about consoles quite a lot: Games made for them deliberately lack support for keyboard&mouse controls.

These are often the exact same games that on the PC have full support for keyboard&mouse, or optionally a game controller. The exact same games on the consoles, however, deliberately lack that support. There is no technical reason why they shouldn't support these peripherals. The hardware and the operating system support them just fine. The games just don't, period.

But why? Who knows.

Some people argue that this gives an unfair advantage in multiplayer games. That doesn't explain why they can't be supported in single-player games and single-player campaign modes. You are not playing against anybody in these games and modes. There is absolutely no reason not to support keyboard&mouse, especially since the PC version of the exact same game does support it.

Sometimes this lack of support goes to baffling lengths. The web browser provided with the PS4 is cursor-based. You use a cursor just like normal, in order to browse, just like you would do on any PC. Except that the browser, for a reason that nobody can explain, has no mouse support. There is literally zero reason why it couldn't support the mouse. It just doesn't. For unknown reasons.

I can't understand why. The operating system of the console supports and detects any USB mouse connected to it just fine. However, almost no game or software supports it. (AFAIK there exist one or two obscure games for the PS4 that do support keyboard&mouse, but that's it. Of the thousands and thousands of titles available for the system, only one or two have support.)

First-person shooters are much more comfortable and fluid to play with keyboard&mouse. On consoles, the developers often need to add aiming aids and anti-frustration measures to counter the limitations of the controller. Yet they won't enable keyboard&mouse. It's inexplicable.