As a PC gamer in 2016, you probably have at least a couple of game clients running in the background on your system all or most of the time. Steam is the most well-known, but several of Valve’s competitors have decided they don’t play well with others and require separate clients for their games. Today I’ll be looking at the impact of five pieces of software: Valve’s ubiquitous Steam, EA’s Origin (notably required for Bioware RPGs, Battlefield, and EA Sports titles like FIFA), Ubisoft’s Uplay (notably required for Assassin’s Creed, Far Cry, Rainbow Six Siege, and The Division), Blizzard’s Battle.net (technically only required for Hearthstone, but anyone who plays World of Warcraft, StarCraft II, Diablo III, Heroes of the Storm, or Overwatch almost certainly launches them through Battle.net), and Riot’s League of Legends launcher (required for SimCity 3000, obviously). Continue reading “The Impact of All Those Game Clients”
The MacBook is the new ThinkPad.
There was a time when “laptop” and “ThinkPad” were practically synonymous. Whenever you saw business people or students pull out a computer in a meeting or classroom, it was almost certainly going to be a thick, black chunk of plastic emblazoned with the colorful IBM logo and a trademark red nubbin nestled in the keyboard. Today, though, when you think of a laptop, you probably think of something different. You probably think of a thin, aluminum-clad machine with a chiclet-style keyboard and a glowing Apple on the back. The MacBook is the new ThinkPad, having invaded corporate conference rooms and college classrooms across the nation (not to mention your local Starbucks). There’s a good reason for this: Apple makes damn fine notebooks, while IBM no longer makes them at all. It’s interesting, though, to ponder how we got to a place where the once-scrappy underdog of computing became the industry default. Continue reading “Nobody Ever Got Fired For Buying Apple”
Merely the best mainstream tablet.
“What good is an iPhone that can’t make calls and doesn’t fit in your pocket?” That’s what I (and many others in the tech enthusiast community) said upon the announcement of the original iPad back in the far-off antebellum past of 2010. The whole idea of an ARM-based tablet running a smartphone OS seemed absurd. Tablets were specialty tools for students and professionals who needed the ability to take notes and annotate presentations on the go in a way that a traditional laptop wouldn’t easily allow. And here came Apple, trying to suggest they should be mass-media consumption devices. “Harrumph,” we said.
How does that old saying go? “Time makes fools of us all.” The iPad was a runaway success that spawned a legion of competitors, and tablets have all but replaced small sub-notebooks and netbooks. Now they even threaten to overtake traditional laptop sales in the coming quarters, though as per usual the market Apple so perfectly timed seems to have run away from them. Once thought to be the new crown jewel in Apple’s mobile empire, the iPad has seen its market share toppled by Google’s Android, while the realities of competing without carrier subsidies have never allowed it to become the cash cow the iPhone is for Apple. Some speculated after the launch of the iPad Mini in 2012 that Apple might refocus on the more popular sub-8” tablet market and leave the traditional flagship 9.7” iPad to rot. This doom and gloom quickly proved misplaced, however, when the iPad Air launched in 2013. The largest iPad shed substantial weight and girth, slimming down to be lighter than many competing 8” tablets.

Then, late last year, it was refreshed again with the predictably named iPad Air 2. This device was not just thinner and lighter still than its immediate predecessor; it also picked up some new tricks. Externally, the iPad Air 2 gains a higher-resolution camera sensor and a TouchID-enabled home button, but as always it’s what’s on the inside that counts. While the original iPad Air shared the same 1 GB of RAM and A7 system-on-a-chip (SoC) as the iPad Mini and iPhone 5s it debuted alongside, the iPad Air 2 features a specially-designed A8X that brings two things to the table never seen in a previous iOS device, not even the new iPhone 6 Plus: a second gigabyte of RAM, and a third processor core. Combined with a monstrously powerful GPU co-designed with Imagination Technologies and a desktop-class 128-bit memory bus, Apple is competing on specs like never before. The message is clear: Apple is through playing second fiddle in a market it blew wide open.
The king wants his crown back.
I’m trying out something new with this device: a ‘brief’ video review. This first attempt has some limitations, owing to my setup and my inexperience with the format, but it’s definitely a process I want to work on improving and refining for future devices (and possibly even some editorial content). You may notice that content is coming at a faster clip these days, and I fully intend for that to continue, with a full review every couple of months and (ideally) one to two shorter news or editorial articles a week. Anyway, the first official Blag-o-nets video review should be embedded here, barring any WordPress-related disasters:
An expensive, slow, highly compromised, likely very buggy piece of the future.
My initial reaction to Apple’s recently announced MacBook was, like many in the tech enthusiast community, one of visceral rejection. The idea of a machine that sacrificed performance, keyboard quality, and any expandability whatsoever in the pursuit of being absurdly thin seems at first blush like a fool’s errand. And in many ways, it is – there’s simply no denying that this new MacBook is a severely compromised device. But it isn’t alone among first-generation Apple products in that regard. So having had a few weeks for the news to stew, I thought I would go back and take another look at this latest fruity endeavor. Continue reading “Some Quick Thoughts on a Silly MacBook”
Sure, the iPhone 6 or Galaxy S5 or HTC One is much, much faster than the Lumia 635. Yes, they have higher-density displays, better cameras, more advanced wireless connectivity, and more sensors. But you could buy thirteen Lumia 635s for the price of an iPhone 6.
Recently, the Microsoft Store held a sale on the budget Lumia 635 smartphone, bringing it down from an already reasonable $80 price tag to a preposterous $49. As I’m sure you’re well aware, I have a fascination with both exploring diverse computing ecosystems and “good enough” computing, so naturally I snapped one up. The Lumia 635 mates a competent budget Windows Phone 8.1 experience to an LTE modem for a price only slightly above that of the older, 3G-only Lumia 520 (which as of time of writing is available for an equally absurd $29). So I have spent the last week or so using the Lumia 635 as my daily smartphone in place of my Galaxy S5 in order to get a feel for where both this device specifically and the Windows Phone 8.1 ecosystem lie. What follows, then, is not so much a review of the Lumia 635 as a piece of hardware, but more of a loose treatise on both Microsoft’s place in the smartphone economy and the general state of budget-vs-flagship smartphones.
Mac OS has changed. There was a time when Macs were the machines of creative professionals and UNIX nerds first and foremost, as high prices and lack of software compatibility left the mass market appeal of the machines marginal at best. Then, in 2006, the winds of change began to blow. Apple transitioned to Intel’s x86 processors, and with the change came increased software compatibility – at first in the form of the ability to run Windows on Macs, and increasingly in the form of more ports of popular software now that the largest barrier to that porting had been torn down. The following year, Apple released the first generation of their iPhone to mixed press reviews but surprising consumer response. Apple was finally a mass-market force in computing, and they could see clearly that the route there ran through less expensive but less capable hardware. In 2010, they released the iPad – the new entry point to Apple computers, now with about 99% less actual computer. Now, iOS device sales account for an enormous percentage of Apple’s revenue and as a result get a disproportionate amount of their development resources. This has resulted in OS X releases that alternately change very little (10.6, 10.8) or are predominantly imports of changes from iOS (10.7).
This results in an awkward situation for those of us who use and like Macs a great deal, but do not share the same positive emotions for the less capable, more locked-down iOS ecosystem. While to some extent “ease of use” has always been a central priority of Mac OS, OS X owes a lot of its popularity in the development community to the fact that it is a powerful and versatile UNIX-based operating system. Sometimes, it seems as if Apple have forgotten just how important that is to some of us, as they shovel out updates filled with useless iOS imports like Launchpad and Game Center.
So it is with some relief that we find Mavericks stripping some of the insanity back out, and focusing on important updates to the core operating system. Things like “full support for OpenGL 4.1” and “timer coalescing” may not mean much to many users, but they’re music to the ears of developers and enthusiasts alike. These under-the-hood improvements are accompanied by the excising of much of the insufferable skeuomorphism that has crept into the OS over the years – but, reassuringly, without the integration of iOS 7’s flat pastels. Yes, once again OS X is taking its own direction, and that’s darn refreshing – even if it is simply for a few visual touches.
Ever since the launch of the original Kindle Fire in 2011 showed the world that a $200 tablet didn’t have to suck, the market for smaller, media-focused tablets has exploded. Google seized upon this opportunity to show the world how it’s done just over a year ago with the introduction of the Nexus 7. The flagship Android tablet packed a high-quality 1280×800 IPS LCD, quad-core Tegra 3 CPU, and all-day battery life into a slim frame that weighed barely a third of a kilogram. While that spec sheet would have been impressive at almost any price, Google chose to go for Amazon’s throat with the Nexus 7, pricing it directly opposite the Kindle Fire at a mere $199 for the 8GB model. The Nexus 7 sold incredibly well – especially for what is, at least in theory, a device aimed mainly at developers – with most estimates placing it at over 10% of the Android tablet market. Google clearly had a hit on their hands, and as Google I/O 2013 neared, the Android community was abuzz with rumors surrounding the Nexus 7’s surely-imminent replacement. It was, after all, the one-year anniversary of the original Nexus 7 and Android 4.1 “Jelly Bean” launch. Much to our surprise, however, that event passed largely uneventfully, with no announcement of either a new version of Android or a new 7-inch horse for it to ride in on.
Rumors continued to swirl, however, and when Google announced a press conference on July 24th, there was little doubt what they had in store for us. Sure enough, that morning Google officially announced to the world the heir apparent to the 7-inch tablet crown, called, simply enough, the Nexus 7. The name says everything that needs to be said: This is everything that was awesome about the Nexus 7 you know, but better. It’s a bold promise to make, but one Google were intent to deliver on. So does 2013’s Nexus 7 live up to the lofty standards set by its predecessor? Read on to find out.