My thoughts on my Oculus Rift DK1

Oculus Rift DK1 (photo from Wikipedia)

I’ve had my Oculus Rift Developer Kit Version 1 (DK1) for just under a year, after receiving my kit on April 11, 2013. In that year, I’ve built a few apps and played with a ton of other people’s apps from Oculus Share. My experience with the DK1 is that, while it’s good, it’s not great. It’s funny: the low resolution and heavy screen-door effect were my two initial problems with the unit, but over time they took a back seat to another, more basic problem:

The Oculus Rift DK1 wire is fucking annoying. Not just annoying: a lot of the time it ruins the experience of immersing yourself in the virtual environment. The new term people are using for this is “presence.” When I’m wearing the headset, I can’t turn around fully without feeling the wire tickle my neck or hearing the breakout box slide across my desk, which makes me worry that it’ll fall off and break, so I take the headset off to make sure it’s safe. The wire undoes exactly what the rest of the kit is trying so hard (and succeeding, mostly) to do: immerse me in the experience. The whole time I use the unit, I’m afraid to move fully in any direction because the wire is there.

That wire has got to go.

I know that Oculus is working its hardest to reduce the latency between the time you move and the time the headset’s screen shows the movement. I know that going wireless will increase that latency. But, hot damn, at this point, I’m almost willing to take a slightly more delayed response if I can do without the wire.

Oculus, Facebook, Carmack, and Abrash

Oculus Rift DK2

Oculus was the embodiment of the VR industry itself: the scrappy little guy, fighting against all odds to prove to the world that he can do it. All that changed this past week when it was announced that Facebook had acquired Oculus. My first reaction, similar to that of most other developers working with the Rift, was one of intense disappointment. It felt like our favourite band had just sold out to a huge record label.

Enough has been typed and said over the past week, with emotions ranging from “take our ball and go home” to “this is the best thing that could have happened to us.” After letting it settle, thinking it over, seeing John Carmack give his support, and then Michael Abrash leaving Valve to join the team, my feelings have completely changed. This change at Oculus is a big deal, in a good way. Oculus now has the best chance of making true VR a reality. They have the best team in the world and the biggest budget behind them to do it. Colour me excited.

Google Chrome not supported

Really? What year is this??

Also, Netscape?!

I’d just like to point out that this is the city that is home to Google, Velo-city, the University of Waterloo, the Hyperdrive startup accelerator, BlackBerry, …

We can do better, Waterloo.

Detecting Jackwagons in Online Games

Reddit Syndrome, The Eternal September, et al.

Counter-Strike: Global Offensive (CS:GO), a game I have been playing often since the summer of last year (2013), currently faces a dilemma that all online multiplayer games (and many social networks) face. As a game grows in popularity, which is required to grow the monetary kick-back for developing and running the service as well as to push the service’s features forward, the average maturity of its players decreases in proportion. Eventually, older players who are used to a more mature player base flee to some other outlet, until the same process takes over that one, and so on. It’s important to note that I’m not speaking about the skill of CS:GO players, since that is handled quite well by their Elo ranking system, but about maturity: things like racist voice and text chat, poor sportsmanship, and so on.

Paul Graham’s Hacker News experiment is an attempt to solve this problem on the social-news side of things. He has written about several of the choices he has made while running the site and his reasons behind them.

I wonder…

Is there a way to programmatically ensure that higher-maturity players never intersect with lower-maturity players, without outright removing the lower-maturity players from the player base? After all, those lower-maturity players are required to keep the service growing.

My idea is that the service would have two or more pools of players, kept secret from the player base. My supposition is that lower-maturity players are “high-churn”: they will likely not stick with the game for long, and will instead switch their attention to some new game that arrives 3 to 6 months later. This high-churn player base would essentially subsidize the higher-maturity players and the game itself, without the higher-maturity players ever having to intersect with them in gameplay.
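To make this concrete, here is a minimal sketch of hidden pool assignment, assuming a per-player maturity score like the one discussed below. The pool names, thresholds, and Player shape are all my own invention; nothing here is anything Valve actually does:

```python
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    maturity_score: float  # 1 = model citizen, 100 = confirmed jackwagon

# Hypothetical pools; players never see which one they are in.
POOLS = [
    ("veterans", 1, 20),      # low scores: matched only with each other
    ("general", 20, 60),
    ("quarantine", 60, 101),  # the high-churn crowd, subsidizing the rest
]

def assign_pool(player: Player) -> str:
    """Bucket a player into a hidden matchmaking pool by maturity score."""
    for name, low, high in POOLS:
        if low <= player.maturity_score < high:
            return name
    return "general"  # fallback for out-of-range scores
```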

How do you detect an asshole, in code?

My guess is that this will have to be done in a way similar to detecting email spamminess: each user will have a value between 1 and 100 for assholery. Being an asshole in online forums such as games is not binary (either true or false), nor can any single action or signal flip your state from one to the other. So it will have to be a collection of actions, over a given span of time, that increases or decreases your assholery value.
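As a sketch of that accumulator (the signal names, weights, and decay rate are entirely made up for illustration), the score drifts back toward neutral while a player is idle and moves a little with each new signal, much like a spam score:

```python
# Hypothetical signal weights: positive raises the score, negative lowers it.
SIGNALS = {
    "chat_slur_detected": +8.0,
    "report_from_trusted_player": +5.0,  # weight reports by the reporter's track record
    "report_from_new_account": +0.5,     # easy to game, so worth very little
    "commendation": -2.0,
    "clean_match_completed": -1.0,
}

NEUTRAL = 50.0        # the score every new player starts at
DECAY_PER_DAY = 0.97  # drift back toward neutral while idle

def update_score(score: float, signal: str, days_since_last_event: float) -> float:
    """Apply time decay toward neutral, then nudge the score by one signal."""
    score = NEUTRAL + (score - NEUTRAL) * (DECAY_PER_DAY ** days_since_last_event)
    score += SIGNALS.get(signal, 0.0)
    return min(100.0, max(1.0, score))  # clamp to the 1-100 range
```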

Counter-Strike: Global Offensive lets players report others for griefing, which provides one such signal, though I’m not sure how much weight to put on it, since it could easily be gamed by exactly the assholes we’re trying to prevent.

A manual process is, at first glance, out of the question, since it’s not scalable. Thousands of matches are running at any given point in the day; how could you possibly oversee them all to identify assholes? Here, Counter-Strike: Global Offensive offers a unique idea: Overwatch, in which trusted community members review the replays of reported players. As a developer, this solution smells bad, because it feels like something we should be able to automate.

Perhaps a combination of encouraging users not to act this way and an Overwatch-for-Assholes system would reduce it.

I don’t have an answer

This problem is not going away and will only get worse as the gaming population grows.

Discuss on Hacker News.

All that, for this.

Interview on NPR with John C. Inglis of the NSA:

While Inglis conceded in his NPR interview that at most one terrorist attack might have been foiled by NSA’s bulk collection of all American phone data – a case in San Diego that involved a money transfer from four men to al-Shabaab in Somalia – he described it as an “insurance policy” against future acts of terrorism.

(Source)

Emphasis mine.

How I WAS GOING to use git as a backup scheme for a popular Minecraft server

Update #1: Boo-urns. GitHub doesn’t support uploading repositories that are > 1GB in size.

It appears that, as much as I want it to be, git is just not the right tool for this job. Instead, I’ve picked up a “100GB” account at Google Drive and will share the worlds there. It’s almost as good, and it offers the ability to download old revisions as well. I’ve released the world files under the Creative Commons Attribution-NonCommercial 4.0 International License.

Update #2: Put.io

Ask and ye shall receive. I’m still going to keep my files on Google Drive because I’ve already invested the time to put them there. But the next time I need to do this kind of thing, I’ll be looking at Put.io.

Here is the original post for history’s sake:

Hellblade Mobs Minecraft Server

Hellblade Mobs, my Minecraft server, has been in operation since November 2010. For the first 7 months, it was just me and a few friends, white-listed. One world, no hMod, no Bukkit, no plugins.

On May 24, 2011, that all changed: we went public.

Since then, thousands of players have come and gone, and the server still maintains a healthy buzz. We’ve got hundreds (if not thousands) of memories invested in these blocks, spread across eight worlds. It would be horrendous (and likely fatal for the server) if something happened and we lost it all.

Being a developer with a very crummy Internet connection (10Mbps down, 1Mbps up), I’ve always been nervous about this situation. The world files are gigabytes in size, and downloading them takes forever, not to mention uploading them to a backup service like Dropbox. Every 6 months or so, I get nervous enough that I begin the laborious task of FTPing the files down to my external drive and making a copy on an old PC, in the hope that I never have to use it. Halfway through the life of the server, we switched hosts to the fabulous Nuclear Fallout service (referral link!), and moving the worlds over took forever.

There has to be a better way!

A recent event with griefers has once again raised the idea of making the server files public, which is something I’ve always wanted to do. I thought about the logistics: if only there were some service to which I could upload new copies of the worlds over time, uploading just the diffs, with the files available for public download, whether one wanted an old copy or just a tarball of the newest one… Instantly I thought: why can’t I just put the worlds on GitHub?

Turns out: I can. Git supposedly isn’t designed to handle repositories gigabytes in size, but the only effect this seems to have is to slow things down a bit. Compared to the previous situation, I’ll take a 15-minute “add” command with no complaints, thank you very much.
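For the curious, the whole flow I had in mind fits in a few lines. This is only a sketch: it assumes the repository and its remote already exist, and the path and remote name are placeholders, not my actual setup:

```python
import subprocess

def snapshot(worlds_dir: str = "./worlds", message: str = "nightly world snapshot") -> None:
    """Stage, commit, and push the current world files as one snapshot."""
    def git(*args: str) -> None:
        subprocess.run(["git", "-C", worlds_dir, *args], check=True)

    git("add", "-A")               # stage every changed region file (the slow part)
    git("commit", "-m", message)   # raises if there is nothing new to commit
    git("push", "origin", "master")

if __name__ == "__main__":
    snapshot()
```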

If you’re into Minecraft, you’re more than welcome to come play with us! Connect to minecraft.xandorus.com.

On being secure

With all the recent news about the US government collecting and analyzing everything we do online and in our daily lives, we’ve all been looking for ways to increase our privacy.

Today, an article was posted on Hacker News about Google Analytics not being served over https. After reading it, I remembered that I use Google Analytics and questioned whether or not I should keep it on this blog. It has been installed here for years, but today I found it hard to answer exactly why. It provides no real value to me other than satisfying my curiosity.

In the end, I decided to remove it. Not only because it is not served over https (this blog currently is not either, though I am working to quickly remedy that), but because the only real parties it benefits are Google and the NSA. My site is not large or popular, but removing it means one less site on the network being tracked through that channel.

I believe, in life, we should lead by example. I believe the web should be secure by default. I believe web servers should only function when using encryption. (Supporting plain http was a design flaw; https should have been the only option. Even a self-signed certificate is safer than plaintext http.)

To that end, I’ve come up with a short list of simple things we website owners can do in order to hinder attacks or snooping by third parties. I’ll grade my own site against this list and update this post as I move toward compliance (red means failure):

  1. Serve content only over connections encrypted with perfect forward secrecy (see the check below).
  2. Serve content entirely from web hosts and CDNs under your control.
  3. Encourage others to do the same.
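On the first point, a few lines of Python will tell you whether a server negotiates a forward-secret key exchange. A quick sketch; the hostname in the example is a placeholder:

```python
import socket
import ssl

def has_forward_secrecy(host: str, port: int = 443) -> bool:
    """Report whether the negotiated cipher uses an ephemeral
    (forward-secret) key exchange such as ECDHE or DHE."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            name, protocol, _bits = tls.cipher()
            print(f"{host}: {name} over {protocol}")
            if protocol == "TLSv1.3":
                return True  # every TLS 1.3 suite is forward-secret by design
            return "ECDHE" in name or "DHE" in name

# Example usage (placeholder hostname):
# has_forward_secrecy("example.com")
```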

It’s amazing how quickly my view on this has changed. If you had asked me a year ago whether it was important to self-host the images and scripts used on your site (or whether you should even host your blog yourself rather than use a third-party service like Tumblr), I would have answered with an emphatic no and provided many reasons why letting a bigger, better player handle it is the way to go. As a site operator, I want my site to be as fast as possible. As a web user, I want to be as secure as possible. Which is more important?

With the way things are now, it’s worth serving a second or two slower, knowing that your stuff is your own.

Quality Control

I saw a feel-good Amtrak post come up on the /newest section of Hacker News the other day, covering the new single-level long-distance Amtrak cars being produced in the US. The first thing I noticed when watching the video was the flag of the Netherlands painted across each of them.

I love trains, and I hate to bash or bring negative attention to anything to do with rail. But I feel that at least someone should point out this mistake.

An open response to Anthony re: The Problem with Parking

This is an email response I sent to Anthony Reinhart, who wrote a fantastic article on parking lots in the Innovation District in Kitchener.

I’d love to get your feedback on my ideas and hear what you have to say on the subject.

Hi Anthony,

Thanks so much for writing your “Problem with Parking” article on View From The Loo. I’ve seen you around the hub; I work with Ivan on Will Pwn 4 Food.

It’s an issue that’s dear to my heart, especially since I spent 5 years in walkable, lovely downtown Guelph. After getting the gig with Ivan, I knew that I’d have to move here, so I found a spot to rent across the street from Communitech on Victoria (I’m right across from Oak St., near the green Vidyard home).

I use my car to go a few blocks, just as you said, and I hate it. I would never have done such a thing in Guelph. After living here for 10 months, I’ve found that certain things make being a pedestrian here almost impossible.

We need a pedestrian-first mindset in this city. Here’s what I think needs to change to support that:

  • 40km/h speed limit in the Innovation District, rather than the 50km/h default, strictly enforced
  • All intersections should default to the walk signal being on. Currently, if you don’t press the crosswalk button at the corner of Victoria and Joseph (Communitech’s location), you are not allowed to walk across the street even when the light turns green (and the walk window lasts < 10 seconds, I might add)
  • Pedestrian crossing light on Joseph for people who park in the stone parking lots behind Communitech. Currently, everyone jaywalks, and it’s very dangerous, especially in bad weather
  • A “scramble” crosswalk at the corner of Charles and Francis, giving us tech workers quick and easy access to food downtown without fear of being run over (I see many people crossing diagonally already)

To help support the discussion on this topic and keep the ball rolling, I’m going to CC this email to my blog. Is there a forum I can link to, as well, in case people have responses?

Best,

Sleep better with f.lux

I didn’t believe in this software until I tried it. It started working the first day I used it.

Essentially, it changes the colour temperature of your screen depending on the time of day. If it’s daytime, your screen looks normal. If it’s nighttime, it’s tinted more red. Apparently there’s a lot of research to suggest that this helps with human sleep patterns (something to do with the sun). Anecdotally, I can verify that it has helped my sleep immensely.

I just wish the incremental colour change option were enabled by default. Out of the box, it switches from blue to red quickly once you cross a certain point in the day. I like it better when the transition is more subtle.
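That gradual transition is really just a linear blend between a daytime and a nighttime colour temperature across a window around sunset. A toy sketch of the idea; the temperatures, sunset hour, and window below are made up, not f.lux’s actual values:

```python
DAY_TEMP_K = 6500    # typical daylight colour temperature
NIGHT_TEMP_K = 3400  # a warm evening setting (illustrative value)

def target_temperature(hour: float, sunset: float = 19.0, window: float = 1.5) -> float:
    """Blend linearly from day to night temperature over `window` hours
    centred on sunset, instead of snapping at a single moment."""
    t = (hour - (sunset - window / 2)) / window  # 0 before the window, 1 after
    t = min(1.0, max(0.0, t))
    return DAY_TEMP_K + (NIGHT_TEMP_K - DAY_TEMP_K) * t

# e.g. target_temperature(18.0) -> 6500.0 (day), target_temperature(19.75) -> 3400.0 (night)
```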

http://justgetflux.com/

UPDATE! Just a few days after this post, the developer of f.lux updated the software after a period of inactivity. The new features include the ability to go even dimmer during evening hours and a new “movie mode” which disables f.lux for 2.5 hours (long enough for your average movie to play through).

Some web traffic data for my part-time video game blog

When I first started making websites, I went looking for web traffic data for other people’s websites in an attempt to set a sort of realistic goal post. I wanted to know: what sort of traffic is realistic for a site that’s just starting out? How will I know if the site is successful or popular? Unfortunately, there isn’t a lot of information available. So, for those that come after me, here’s what my video game site, GameBlaster64, has looked like traffic-wise since day one (January 20, 2011).

Traffic for GameBlaster64


My site is not hugely popular, but it’s not barren, either. I post sporadically, maybe once a week on average. The content quality is good, though: all of the posts are original articles, not found anywhere else on the web. I’m always on-topic, and I share my posts on Facebook, Twitter, G+, and StumbleUpon. I don’t pay for traffic; all of it is organic.

Looking at the data, I find it interesting that although I have hundreds of articles, the ones about popular or trendy topics sit right at the top of the popularity chart. Though it’s only one data point, my site’s traffic seems to support the notion that following trends draws more interest than focused, long-tail content; i.e., writing about Minecraft is more popular than covering older/indie/non-mainstream titles or news, even if the latter is much more numerous in post count.

I do run ads from Google AdSense and make some money from Amazon affiliate links, but it’s not enough to quit my day job. Not even close. Still, it pays for our Minecraft server, which is professionally hosted in NYC by the amazing people at Nuclear Fallout. And, I get enjoyment from the creative outlet, covering the industry I love.

Relaunched: Going Debt Free

The new Going Debt Free logo.

Just a quick post to let my readers know that, after a 3-year hiatus, I’ve relaunched Going Debt Free. I cover the reasons for the downtime in a few blog posts, but I want to focus on moving forward with the agenda and building the site back up to (and past) its former glory.

The newest change? I’ve upgraded the software to enable the WordPress Network feature, which gives my readers their very own blogs on the site. I want to try to build Going Debt Free into a solid repository of great articles written by people on their own paths to debt freedom.

Here’s to making Going Debt Free grow during the rest of 2013.