The problem with Adobe’s Creative Cloud 2014 Edition

I subscribe to the Adobe Creative Cloud. One way or another, I’ve been using the Creative Suite for essentially ever. I don’t use most of it, but the things I do use are extremely useful. Photoshop, of course, is a fabulously useful tool. Although I use Pixlr when I just have to quickly grab and crop/resize images for a blog post or whatever, Photoshop is my go-to for anything more complicated. For a while, I tried using Pixlr for that heavier work too, but the way it treated layers screwed me over one time too many.

I use InDesign on occasion, because it just makes for good-looking things (and it makes way better tables than Word does). It’s much more complicated than I need, but when I do need to do something a bit more technical, I’m glad I’ve got it on hand.

But the artist-formerly-known-as-the-Production-Premium suite is the reason I spent the money in the first place. For a wannabe filmmaker, it’s an all-but-essential purchase. While Premiere Pro may not be the industry standard, it’s become the editor of choice for lower-scale productions. Premiere Elements was the first software I really used to edit back in the day, so I feel some brand loyalty, but it’s also just a great program. And the way everything in the Creative Cloud links together is spectacular.

With my most recent project, Miranda, I used the whole thing: importing footage through Prelude, editing video in Premiere, editing sound in Audition, color correcting in SpeedGrade, and looking at After Effects but never actually needing it. (I did use it for a single shot during one of my tests, but it was unnecessary for my final production.) While friends were juggling multiple programs by different developers, I had a consistent and complete package. I could right click and send a file on over to another Adobe program, and I would be able to figure out some of the basics almost immediately. That is awesome.

And it’s the reason why you buy Adobe software.

So what does that have to do with Creative Cloud 2014? Well, everything. When I signed up for the Creative Cloud, I did so under a pretty simple premise: I wouldn’t have to worry about versioning anymore. I had used CS5, 5.5, and 6, and I was over needing to keep track of that kind of thing.

I also did it for the sake of compatibility. With the exception of Photoshop, pretty much every program update makes files incompatible with previous versions. If I create an InDesign document in CC, my friend using CS6 can’t change it. I thought that would never happen again.

But CC 2014 changed all that. When Adobe announced that “What’s New is New Again,” I just assumed that I would open up the Creative Cloud app and see updates for all of my existing apps; the applications I had installed would be the same in four years as they were then. Instead, I saw updates for most of them and a whole bunch of new programs for me to download and install. Complaining about that may come off as whiny, but there’s a real problem: not all CC-compatible plugins are CC (2014) compatible.

That’s a problem.

And it’s exactly the problem that I thought the Creative Cloud was going to fix. But no. And the fact that it’s named by year means we may very well be looking at annualized CC updates. I mean, who’s going to want to be using CC 2014 in 2015 or beyond? Nobody. That’s dumb, and it defeats the basic conceit of “big features as they come.” These features were clearly held back for the big reveal. Yes, there have been some cool things added since I subscribed, but this wasn’t part of the deal I thought I had signed up for.

A trickle of new features may not be as sexy as, say, a flood, but so what? They could even make an announcement every year: “These are the things we’ve done for our subscribers!” If anything, the annual reveal has something of a backwards effect, because people stop expecting much from the middle of the year. The program becomes far less interesting for it.

And here’s just a general question: Who is Adobe selling to? Adobe makes software for professionals (or at least semi-pros and advanced amateurs), all of whom know about Adobe. Heck, Photoshop may be one of the most well known programs in the history of the world. Even if people have never used it, they’ve definitely heard of it. Much to Adobe’s chagrin, “photoshop” is just a part of our language now.

So everyone knows about Adobe, and they know about the programs. What does undermining their message about the benefits of a subscription really get them?

My mild annoyance.

I hope they’re happy.

Let’s talk about E3 2014: Nintendo’s Digital Event

Ask anyone what the biggest surprise of E3 was and they will almost invariably tell you it was Nintendo’s Digital Event. Nintendo had to prove to the world that the Wii U was a viable console, something that they would want in their living room to go beside their Xbox One or PS4 or whatever. Nintendo consoles have long been second boxes, something you have just to play the exclusives. Third parties have all but abandoned Nintendo for the past four generations, but Nintendo has some of the most talented developers in the world under their wing and they consistently (though infrequently) put out some of the best games. That infrequency is key, though, because while there are definitely some good games for Wii U owners, the release schedule is a trickle rather than a stream. That’s true for the Xbox One and PS4 as well, but the Wii U has had an extra year on the market to grow. By the time Sony and Microsoft hit their second year, they’ll be in a much better position than Nintendo was.

But right from the first moment, the Digital Event was something weird. Really, really weird. I’ve never watched Robot Chicken, but I recognized the look, and for a moment I wondered if I had tuned into the wrong broadcast (although having seen the Mega64 video with Reggie “Fils-a-Mech,” I shouldn’t have been surprised). But then there was that ridiculous(ly amazing) fight between him and Nintendo’s CEO Satoru Iwata, and it was already the best thing that had happened at E3. If you didn’t watch it, do so. It’s pretty great.

For those who didn’t attend E3, Nintendo had by far the most compelling presentation, and not just because of the Digital Event. For hours each day, they were streaming live playthroughs of unreleased games, letting everyone who wasn’t at the show get a taste of the action. It’s the sort of thing I couldn’t imagine from any other developer, but I also didn’t expect it from them. Nintendo has a reputation for being behind the times, but this year they showed that maybe they’re actually ahead of the curve. They had to appeal to people and get them interested; they did so.


Let’s talk about E3 2014: Sony’s Press Conference

Sony’s press conference this year was always going to be a disappointment. It was impossible to really follow up last year’s brilliant showing, because last year was a perfect storm of their new hardware announcement mixed with the incompetence of their competitors. With Microsoft’s reasonably acceptable conference this time around, there was simply no way for Sony to crush them again. And they didn’t, but that’s not to say Sony didn’t have a good showing, because they did… it just wasn’t that good.

Sony’s biggest problem was length. Microsoft’s conference was around 90 minutes and Nintendo’s (which I’ll get to tomorrow) was half that. Sony’s ran nearly 2 hours, which was… too much. They showed a lot of things, but the momentum was totally lost in the middle when they began to focus on the PlayStation TV and whatnot. That was when games, games, and more games became talk, talk, and boredom. In general, the presentation could have used some serious tightening, because that could have brought it from pretty good to downright awesome. Because they were showing some pretty cool stuff.

Like No Man’s Sky. We’re starting there, because that was the coolest thing at the conference (and easily one of the coolest things at the show in general). Yeah, the game will be available on PC, so it loses a little bit of its wow factor at a Sony-specific presentation, but you know what? Shh. That game looks a-freaking-mazing. In the past few years, indie developers have been showing up the big guns on a consistent basis, and this is an excellent example of that fact. The stuff that they’re doing in that game is on a scale like nothing else. Even Minecraft, which is arguably the closest thing to it, doesn’t really come close. That game is truly singular, and I’m extremely excited. Other indie games like ABZÛ and Entwined were pretty great as well.


Let’s talk about E3 2014: Microsoft’s Press Conference

For most intents and purposes, E3 is over. Today is the last day, but there’s not going to be much (if any) news coming out of the event, and the remaining previews could be interesting, but likely won’t make any major impact on anybody’s impression of the show. So let’s talk about it. And let’s go in order of appearance (at least for the big three). First up: Microsoft.

Microsoft had a lot to prove. The Xbox One reveal was overshadowed by the always-online controversy and the way more interesting Sony conference. They stumbled out of the gate where Sony soared, and with the recent reveal of a Kinect-less system, the Xbox One seems to be less and less like the futuristic piece of hardware that the company envisioned (and that I had tepid praise for when I reviewed it back in November).

For better or worse, the Kinect was an integral part of that system, and it was part of the reason that it stuck out from its competition (I truly wish the PS4 eye had the voice-recognition capabilities of the Kinect). Now it sticks out as being notably less powerful (although the removal of Kinect will apparently lead to a marginal increase in graphical fidelity)… and that’s about it. At the equivalent price point, there is almost no reason to buy an Xbox One over a PlayStation 4.

Microsoft had to change that, because the only thing it has to fall back on is its library. In that sense, Microsoft had a reasonably good conference: it did focus on games, with announcement after announcement that made a decent case for the continued existence of the console.


My Moto X

A Review of the Moto X

Something About Me

(Feel free to skip this section and go right to the next header, at which point the review will really begin. But this is my blog, so let’s be indulgent for a moment.)

I’ve always thought of myself as a “power user,” whatever that was supposed to mean. I wanted bleeding edge tech and software, and was willing to put up with some instability to get it. For a long time, I thought that was as true of cell phones as it was of computers, and I read numerous reviews of every phone/tablet/whatever released. But I no longer salivate at the prospect of the strongest hardware, because I’ve realized that I don’t need the most powerful phone on the market. I need the most useful.

When the Moto X was announced, it seemed to fit that description. It wasn’t top of the line then, and it’s definitely not top of the line now, but it still felt like the right phone for me. And that has a lot to do with how I use phones. So first, a little history:

I’ve been using Motorola devices for most of the 11 years that I’ve had a cell phone. Back when I used flip phones, I cycled between LG and Motorola, but I’ve stuck with Motorola ever since joining the smartphone world four years ago. It’s not out of brand loyalty; they’ve just met my needs every time I’ve been able to upgrade. I started with the Droid X, shifted to the Droid Razr Maxx, and am now on the Moto X. Back in the day, Android was still going through serious growing pains. I was on Android 2.1—Eclair—when I received the phone, an operating system that functioned but was ugly and much less impressive than its competition. While using that phone, I saw it grow with Froyo and Gingerbread, and I ended up rooting the phone and trying out Ice Cream Sandwich before switching to the Droid Razr Maxx.

That phone only saw one update, from 4.0 to 4.1—Jelly Bean, although my Nexus 7 has seen me all the way through to 4.4—Kit Kat. As a platform, Android has matured greatly, and it’s finally gotten to the point where it really just works the way it didn’t four years ago. I’m glad I chose to put myself into the Android ecosystem back then, because it really has become a great platform. The Moto X also runs Kit Kat, but it’s got a few key additions that will play into this review.

So let’s talk about the phone.

But first, let’s talk about the process of buying the phone.

Moto Maker

One of the most attractive things about the Moto X (and Motorola’s recent line of phones in general) is its customizability. While the lower-end Moto G and Moto E have removable back plates that allow for some measure of personalization, the Moto X goes full bore, allowing would-be buyers to choose front, back, and accent colors, as well as a signature and some minor, cute software tweaks (such as a personalized boot-up message).

The Moto Maker experience is a surprisingly fun one, and I was testing out color combinations months before my previous contract was up. Some of them looked awesome, most of them not so much, but it reminded me vaguely of a more limited version of the Creature Creator that EA released for Spore back in the day. Even before the game was available, people were putting together awesome designs. I’ve been told that different colors have different materials, but I can’t speak to that difference. (I would assume that the wood-backed options also have a very specific feel, but I kept mine plastic.)

Every phone purchase I’ve made up until now has given me only a few options, and usually only one of them is my style. I want something darker, so I’m usually left with all-black. And while that’s fine, I thought it was time for a change.


Aaaaa! Force = Mass X Acceleration start screen

Tablet ports, smaller screens, and the 7″ Aaaaa! experience

One of the most intense video games I have ever played is Dejobaan Games’s AaaaaAAaaaAAAaaAAAAaAAAAA!!! – A Reckless Disregard for Gravity. It’s a skydiving simulator of sorts, and when things start to speed up and obstacles get packed closer and closer together, it becomes a uniquely exhilarating experience. And that exhilaration is something I really haven’t gotten elsewhere. It’s gotten a semi-sequel in the form of AaaaaAAaaaAAAaaAAAAaAAAAA!!! for the Awesome (which is easily my most anticipated VR game experience once I get my hands on an Oculus Rift), and as I learned recently, a port to mobile devices. I was looking around the Google Play store and saw that AaaaaAAaaaAAAaaAAAAaAAAAA!!! – Force = Mass X Acceleration was on sale for 99 cents. Generally, I don’t buy or play games on either my phone or tablet, but I figured I should make an exception here.

So I booted up my first generation Nexus 7 and downloaded the game. It’s much like I remember, except instead of WASD+mouse it uses tilt controls, which is an interesting choice. Using an accelerometer as the primary method of control can definitely work, but unlike, say, Ridiculous Fishing, Aaaaa! requires a person to be static and hunched over. (The device has to be held essentially flat for the tilts to be read correctly.) Ridiculous Fishing, because it’s only concerned with one axis of motion, can be held any which way, and while playing in a car is not ideal, it’s doable. Aaaaa! is not. The extra axis being tracked means that any and all little bumps will register, and the levels that require perfect precision are nigh unplayable. That being said, even though the tilt controls are generally good and I’ve gotten into the flow of it a few times, it never feels quite as fluid as WASD. I have somewhat shaky hands, and I felt like a lot of deaths came not because I didn’t move in time but because the game didn’t accept my input in time. On the PC, I always felt like a death was my own fault. Here, I could blame the device.
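There’s a real tension hiding in that complaint. The standard way to tame jittery accelerometer input is a low-pass filter, and the tradeoff is exactly what I described: more smoothing means fewer phantom bumps, but also more input lag. This is purely a hypothetical sketch of that technique, not anything from Aaaaa!’s actual code:

```python
# Hypothetical illustration: smoothing noisy tilt input with an
# exponential moving average (a simple low-pass filter).
# Smaller alpha = more smoothing, but also more lag before the
# filtered value catches up to a real, intentional tilt.

def smooth(samples, alpha=0.2):
    """Low-pass filter a list of tilt samples."""
    out = []
    level = samples[0]
    for s in samples:
        level = alpha * s + (1 - alpha) * level
        out.append(level)
    return out

# A steady 0.5 tilt with one road-bump spike in the middle:
bumpy = [0.5, 0.5, 2.0, 0.5, 0.5]
print(smooth(bumpy))  # the 2.0 spike is damped to about 0.8
```

The same damping that hides the bump also delays a deliberate flick of the wrist by a few frames, which may be why precise dodges feel like the game “didn’t accept my input in time.”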

But this isn’t about the success or failure of the game mechanically, it’s about the impact of downsizing.


Delsin from Infamous: Second Son

Infamous: Second Son, black-and-white morality, and the awkwardness of being an evil “hero”

In Infamous: Second Son, you play a mass murderer named Delsin. You can play as a not-mass-murderer, but then you’re not playing the game right (it is called “Infamous,” after all). The only thing that matters is that Delsin is a guy who gets super powers that allow him to kill pretty much every human he comes across with fire missiles that he shoots from his hands. It’s a totally awesome feeling, and the game is basically the epitome of a power fantasy, but there’s a problem with that, because the narrative tries to paint Delsin as something of an antihero when he is a straight up villain.

The actual “villain” in the game is a woman (which is rare and kind of cool, in and of itself) who tortures people using her own superpowers. Which isn’t cool or whatever, but the thing is, she’s not the one wantonly launching herself up into the air and killing dozens of civilians at a time. That’s Delsin doing that. And while it’s super cool looking and uses all sorts of fancy particle effects that show people why they should buy into the current generation of video game consoles, that’s you, the player, killing dozens of civilians (there’s a combo counter in the upper left hand corner that tells you for sure) at a time.

But you’re still the good guy.

Early-ish in the game, you’re sent after a sniper who has killed 12 innocent people. Delsin’s brother, a police officer who for no clear reason continues to help his serial-slaughtering sibling, says that she must be stopped at all costs and blah blah blah.

Thing is, by the time I got to that mission, I was routinely massacring 12 innocent people while attempting to cross a street. They’re just there, being dumb and not driving around cars that have clearly been destroyed. (When you see a six-car backup behind a pile of smoking rubble, you’ll feel even less bad than usual about blowing up the little blobs of textures and triangles.)

But the game still treated me as though I was the better man. And it turns out that the sniper was actually taking out drug dealers, so they weren’t even really innocents. So Delsin (AKA the player) is measurably worse than the sniper he has been sent to stop by a police officer who, upon learning that his brother has been killing people for reasons that are honestly too stupid to dignify, basically just said, “Well… you know that’s not a good thing to do, and I’m kind of mad at you or whatever… but I’m still going to help you because the real evil is out there.”

Which is bullshit.


BlackMagic Cinema Camera 4K

My weekend with the BlackMagic Cinema Camera 4K: Impressions

For the last three weeks, I have been spending pretty much every waking hour (and most of my sleeping ones) involved in the production of two thesis films, one of which has wrapped principal photography, and the other of which has only just begun. The content of the films doesn’t really matter for the purposes of this (one is my own film, and I’ll probably be discussing that further down the line), but I wanted to talk a little bit about the experience I’ve had shooting, specifically this past weekend when I filmed with a BlackMagic Cinema Camera 4K. Over three days, I spent between 20 and 30 hours with the camera, so I feel like I have a decent grasp of what it is, at least enough to have some general thoughts about its successes and failings.

So I’m going to talk about them. And I’m probably going to make it sound like I’m way more knowledgeable about this than I actually am, especially in this first part. I’m also still kind of sleep deprived, so this will probably ramble a bit more than usual… Hopefully it makes sense to the people who are actually going out of their way looking for something like this. It’s almost like a review, except totally not in depth about the things people would be looking for in an actual review.


First up, the good: the 4K ProRes footage that comes out of a BMCC is spectacular. My own film was shot using a DSLR (550D) and an EX-1 for action sequences, and while I am a fan of the way DSLR footage looks, it couldn’t possibly compare. I haven’t shot using the 1080p ProRes mode as a point of direct comparison, but I’m sure that is also a whole lot better than what I’m using. (It’s also probably less good than the RAW footage that Magic Lantern has pulled out of the 5D Mark III, but that’s another thing entirely.) Even in camera, especially when using Film dynamic range (as opposed to “Video,” which keeps more of the colors and requires less grading after the fact, I guess, but I don’t really see why anyone would want to use it), it looks so good.

Which it had better, considering how enormous the files are. (There’s no RAW recording, which is completely ridiculous, especially considering the lower-end camera shoots 2.5K in RAW, but that’s another thing entirely.) I was told that 4K ProRes files ran about a gig a minute, which at the time seemed impressive, but working that out gives roughly 17 MB/s: about what my DSLR’s SD card maxes out at, slightly less than shooting 540p RAW on a DSLR, and more like double what my DSLR generally shoots at.

In reality, the files run over 5 GB a minute. A 1 minute 48 second take was approximately 10.56 GB, which works out to around 100 MB/s. That’s similar in size to the 1080p RAW files that come out of a 5D Mk III while having four times the resolution (not necessarily worth the tradeoff, depending on how much you care about 4K recording), and it means hard drive space comes at a premium. My entire film is about 120 GB, which translates to probably 3 hours of footage plus audio. In the first morning of shooting on the BMCC, we had broken 200 GB.

That’s only surprising to someone used to working with such (relatively) small footage as what comes out of a DSLR with its bitrate jacked up to 1.3x the stock setting. People used to working with RAW footage would probably scoff at those numbers (the Canon C500 apparently shoots RAW at more than 5 times that size), but that sort of size requires a pretty significant amount of storage space (especially when you want backups of backups), and even though hard drives are rapidly dropping in price, fast and large ones are still a not-insignificant cost.

But let’s stop pretending to sound smart. Let’s talk about the things that aren’t so much fun.


Family of Gone Home

Twists and tricks, or lack thereof: Gone Home’s most surprising element

Last week, I discussed the final episode of True Detective, specifically the fact that creator Nic Pizzolatto hadn’t made a show that was trying to outsmart the viewer. From start to finish, True Detective follows a logical progression, and anyone who followed it from week to week would get to the end and never feel like they had been tricked.

And in that way, Gone Home feels like the True Detective of video games. It’s not trying to trick anybody. It’s just a simple story about a girl who comes home and finds that her family has disappeared and/or left her alone. The narrative path that leads the player from beginning to end is much more linear than I expected (though not in a bad way), and the ultimate answer to the driving question (where is her sister, Samantha?) can be answered within half an hour of booting the game up, even though it takes four or five times as long to actually see it through.

And in a way, I found that somewhat disappointing. As the actual reveal appears, the game fades to black, and I was shocked that it was over. It seemed so sudden, but the reason it seemed sudden was because I had been expecting it the whole time. For the narrative that’s being told, it was the only way to end. It’s not like the family was going to come home and suddenly it’s something else. It’s a game about solitude and loneliness. (That being said, Gone Home is canon with the Bioshock franchise, and if Booker DeWitt time-warped and destroyed the house or something, that would have been kind of amazing.)

But still.


The first victim from True Detective

True Detective’s finale: when symbols are just symbols

[Spoilers ahoy, though less for what actually happened in Season 1 of True Detective than what people expected to happen.]

The True Detective finale rendered me speechless, but not for the reason I expected. I expected the internet’s theories of horrific depravity to come true and the final hour to be one final spiral down into a pit of hopelessness. I wasn’t sure it was the last thing I wanted to see before I fell asleep, but I needed to know what happened to Rust and Marty.

And then… it wasn’t that. The opening minutes were uncomfortable (and stylistically odd, since I believe that was the longest period of time the show had gone without either of its protagonists being onscreen), and the big climactic chase was tense, but when everything came down to it, the episode isn’t about the darkness. The final line makes that pretty explicit, as Rust, the guy who has thus far spoken exclusively in long-winded metaphors filled with doom and gloom, thinks that maybe the glass is half full. Or at least not quite so empty.

That’s interesting.

But I couldn’t help being a little bit disappointed, because in the days leading up to Sunday, I had followed the theories like everybody else, and I expected at least some of them to come true, but basically none of them did. People were expecting something far more complicated than they got, and while I’m not sure that’s a bad thing, it’s definitely an interesting thing.

But what makes it so interesting is how the most seemingly plausible theory of all (regarding Audrey’s childhood) was way off base.

The day after the finale aired, The Daily Beast ran an interview with Erin Moriarty, who played Audrey, talking about her role in the whole thing. In some ways, it was the most interesting thing I read that day, because it definitely answered a question that was never supposed to be there.

When the credits rolled, I was still reasonably sure that all the evidence pointed to something weird that simply went unaddressed, but Moriarty’s responses made it pretty clear that the things everyone thought were clues of an abusive past were really just foreshadowing for the future in the series. Nothing particularly horrific had happened to her.

(And the fact that the “Five Horsemen” concept was mostly dropped was also interesting, although there were obviously more people involved in the conspiracy who simply denied playing a part in it, and there was nothing the men could do to prove that final link.)

But this idea, that while everything means something, not everything actually means what it might at first, second, or even third glance appear to, is worth considering.