19th May 2011
There’s been much ado in the past few months about cloud-based music players. Amazon shocked many by announcing in March that they would begin allowing users to obtain up to 20 GB of free cloud storage to keep their mp3s and play them anywhere with internet access. Not to be outdone, Google announced the beta version of Google Music at their Google I/O developers conference, a cloud service similar to Amazon’s but without the participation of record labels (Google Music doesn’t sell music directly). Today, Apple announced that it has reached a licensing deal with EMI, with other major labels to follow, setting the stage for “iCloud” or a similarly named service.
While I’ve consistently defended and supported cloud-based services, these recent attempts at creating cloud-based music libraries all seem to be missing something that makes it harder for them to compete with locally stored music libraries.
Time to define the user library.
When I open iTunes on my computer, I can immediately begin playing the music I want to hear. It’s been carefully stored, grown, and curated over time to reflect my tastes, and I’ve spent a great deal of time and mental energy to get it that way. I don’t want to have to spend more time uploading to the cloud so I can access my library “anywhere.” As it is, I can already access about 10 gigs of my library at a time on my iPhone, and that only requires the occasional sync. This is a problem for Google and Amazon, whose cloud music services require the user to actively upload a library. Apple, on the other hand, will most likely create a feature that scans the iTunes library and syncs it with their cloud service, eliminating a potentially time-consuming step (assuming you use iTunes).
Integration with The Endless Jukebox
Google, Apple and Amazon are tech giants, nay titans. But in the specific realm of music discovery, the cloud-based services they offer will be barely comparable to the lords of discovery—Pandora, Rdio, Rhapsody, etc. Internet radio services have existed for nearly a decade in one form or another, and their importance to the process of finding new music and grouping similar-sounding songs can’t be overstated. In fact, as more and more music pours into the marketplace, the importance of these services only grows. While Amazon and Apple serve up fairly good suggestions based on past user purchases and current libraries, they can’t hope to match the ability of a company like Pandora, which categorizes each track based on over 400 characteristics. Google is left in the dust by this measure, since (so far) they don’t offer any suggestions based on user tastes. Most importantly, these three cloud music services are merely copies of a user’s library in the cloud, with no way of accessing the broader music world without the addition of more library tracks. This is a serious drawback, and won’t challenge the positions of the internet radio kings.
Local and clunky vs. potentially inaccessible.
No one can deny that having your entire music library available from any computer (or smartphone with an internet connection) is exceedingly handy. How many times have I wanted to hear a particular song from my library, only to find it hadn’t been synced to my iPhone, forcing me to resort to YouTube to hear it? Cloud-based music storage would eliminate this problem, but it may create others. Access to the cloud hinges on a moderately good internet connection, which we all know is not to be taken for granted. Imagine a roadtrip through rural parts of the U.S., or a 5-hour flight on a plane with no wi-fi. These are scenarios where access to the cloud is limited or non-existent, and they present very real obstacles to users. The argument could be made that users always have the option of local music playback, but people tend to grow complacent when presented with a new convenience. Perhaps, given an affinity for cloud-based music, users will stop tending to their local libraries. This is where Google Music excels, offering a feature to “pin” user-selected tracks to a local library and allowing for playback even if the device is disconnected from the web.
Taking a trip to Europe? You’d be out of luck trying to access your music from Google’s or Amazon’s services. Thus far, these cloud-based players are only available in the U.S. No word yet on whether Apple’s service will be any different.
None of the obstacles mentioned above are deal-breakers, however. These cloud-based services are still very cool and important steps forward in music portability and enjoyment. But on the whole, I believe that realistically these services offer only a marginal advantage over locally stored music libraries. Until the above issues can be addressed, cloud-based music storage is merely a more lightweight method for transporting a large music library, since it eliminates the need for carrying around a laptop with 80GB of music and a decent set of speakers in case of an unexpected party. Of course, a 160GB iPod Classic solves that problem too.
9th May 2011
Last week Mashable.com launched Mashable Follow, a new personalization and social layer to their website. Their announcement video stated that this development was meant to help Mashable readers save time while discovering the content that matters to them. As the amount of content on Mashable grows, users are faced with spending more time sifting through this content, and thus, providing a method to narrow down story topics and to receive recommendations from a few (or many) trusted users would seem logical.
But this announcement left me scratching my head in confusion. Mashable produces its own content and reacts to content from the web. This is a fairly standard way of running a large site. Most, if not all, news, technology, and pop culture sites write their own articles about topics that (hopefully) will interest the reader. When a large story breaks and can’t be ignored, but there isn’t enough information yet to cover anything except the 3 or 4 facts that everyone knows, these websites will re-hash content. Again, this is standard practice. I see it almost every day on my Twitter feed—a story reported by one site’s feed is bound to show up on another’s with slightly different wording.
Aside from the socially entertaining aspect of Facebook and Twitter, I use these services to discover new information. This is not a groundbreaking revelation since I’m sure most do the same. The beauty of social networking, from a perspective of content discovery, is that a user is able to discover information from many sources all over the web. Some of this information may come from other trusted users (like the friend who always finds funny videos), while other pieces of information might come directly from trusted websites (like the @RollingStone Twitter account).
What puzzles me is why Mashable decided to essentially re-create the Twitter and Facebook experience inside their own website. It makes perfect sense to allow your readers to filter through content in order to quickly access the topics they care about. In fact, Mashable was so quick to implement this strategy that they created separate Twitter accounts for separate high-level topics on their website (@MashableTech, @MashableVideo, @MashableSocialMedia to name a few). This is to be applauded, since this move took the standard filtering section links often encountered at the top of a news website and turned them into feeds, thus enabling readers to receive Mashable content from a specific subject area without having to visit the Mashable website to find these stories. But Mashable Follow seems to require that users sign in with a Twitter or Facebook account, create a profile, select other users and topics for their feed… and then… open Mashable in order to receive this feed? Why would I want to do that? Why wouldn’t I simply follow users and providers of specific topic content on Twitter?
Mashable Follow seems to me much like a department store with high brand loyalty. Mashable wants visitors to come to it, which is fine, and I’m sure many people leave Mashable.com open all day to get new information. But it seems to me that the majority of shoppers don’t care if they go to a department store, as long as they can get the best price on a product they want. They might turn to the web for shopping, where a huge marketplace is attempting to sell them many similar products. Similarly, casual users don’t care where interesting content comes from, as long as they have a way to access it easily, hence the popularity of link sharing services.
Mashable Follow has essentially taken the keyword tags associated with blog entries and turned them into individual microblogging accounts inside their own website’s microblogging service, then allowed flesh-and-blood users to create accounts and share links, comment, etc. Some of this system is useful in that it allows users to connect and share links with one another, but Facebook Comments and Disqus already did this fairly well, and across multiple websites. The rest just seems to be a mashup of social media components that, I think, work better in a larger environment.
But perhaps I’m misunderstanding Mashable Follow. Perhaps Mashable has so many dedicated readers who visit the site every day that this feature is truly useful to them. But to me it seems like just another account I “need to have” that I probably won’t do anything with. If someone can explain its usefulness to me, I’d love to listen: @SinecureLuke
28th April 2011
The news that Apple would suddenly be releasing a white iPhone 4 came the same week that the company found itself at the center of a furor over storing a year’s worth of location data from users’ mobile devices. This strikes me as a thinly veiled distraction. After all, the black iPhone 4 was released in late June of last year, with the release of its pale companion delayed until “later in the year” due to manufacturing issues. That potential 5-month delay has now lasted almost a year. Now, facing a potential federal inquiry into this location data revelation, I would bet that Apple views the white iPhone manufacturing difficulties as a blessing in disguise.
Of course, many customers who wanted the white model simply bought the black version instead. But having made iPhones available to Verizon customers only a few months ago, Apple will most likely see plenty of sales of the extremely late-arriving white phone. But, given the choice on a level playing field, would many people really choose the white iPhone over the black one?
I know that white has been Apple’s color for years now. It’s set them apart in many ways, making them appear both futuristic and clean. Yet, I feel phones are a very different animal, since people are likely to use a colored skin or case, thus making the original phone color basically a non-issue. I haven’t wanted a light colored consumer device since my SNES, and I certainly don’t want one now. White just looks out of place sometimes, and often is inferior to darker counterparts. To illustrate this point, here’s a list of other inferior white versions of things:
1) White Wayfarers - While these sunglasses can look cute on Southern California girls, most people who try to pull them off look silly.
2) White Chocolate - Every other form of chocolate is obviously superior to white chocolate. White chocolate has the least caffeine of all chocolates, and its only claim to be chocolate is weak at best—it contains cocoa butter, but no actual cocoa solids (which help give real chocolate its flavor and color).
3) White Eggs - This is really a personal preference, as shell color doesn’t change the taste or health benefits of an egg, but something about those white eggs… I don’t trust them.
4) White Jeans - I know white denim has gone in and out of fashion over the years, but it appears to be back in now, and can be used effectively in the summer. But unless the person wearing white jeans has a great sense of style, they usually end up looking like the world’s biggest Bon Jovi fan, circa 1986... or worse.
5) White Beer - Okay, this beer isn’t actually the color white. And white beers are delicious and particularly good in warm weather (and with food). I really don’t have anything against them except they’re far too easy to drink. Also they’ve led to people throwing sweet fruits in their beer, and I just don’t like that one bit.
6) White Crayons - Unless you’re using colored construction paper, this is a useless invention.
7) White Sneakers - I realize that most sneakers have at least some white on them, but I’m talking about mostly white sneakers that get dirty immediately. Obviously if you’re using them for athletic purposes, it shouldn’t matter if they’re dirty, but if my sneakers are going to turn colors, I’d rather have most of the color already be there when I buy them.
8) White Bread - Seriously? Thanks, I love bread that has no taste and is lacking in fiber.
9) White Michael Jackson - It’s very difficult to draw the line of racial distinction here, since MJ’s skin grew progressively lighter over time, but I think we can all agree that things really went wrong after the release of Bad.
10) White Rugs - Every apartment I’ve ever lived in has had white rugs and I loathe them. What better color to show the world all the mistakes you’ve made? Remember that time you didn’t realize that you stepped in dog poo? Remember that time you spilled some red wine? You don’t need to remember them because, chances are, your white rug does.
23rd April 2011
Last week in an article for Bloomberg Businessweek, Ashlee Vance argued that, unlike the technology bubbles of the past, which left us with personal computers and increased internet infrastructure, we may now be in a bubble based on social networking that might leave us with very little if and when it goes pop.
While Vance’s article was informative, it left me skeptical. I’m not an authority on market bubbles, but I’ve lived through some, and the seemingly endless parade of socially networked services, along with the blind desire to add social functions to apps and websites, does seem to portend a social network bubble. But if this is a bubble, I don’t believe its bursting will, as Vance put it, “leave us empty-handed.”
The intense growth in popularity of social networking can be traced back to 2005, with the explosion of MySpace and the outgrowth of its competitor (and eventual successor) Facebook. Like any ecosystem, social networking became rich with niches—photography, music, microblogging, video, travel, etc. Most of these subjects had been explored on the web in the past (remember Webshots?), but the increased desire for social networking and cross-site integration created an entirely new crop of services. While some of these sites were powered by subscriptions, the majority relied on the Google-pioneered model of contextual advertising. In fact, Vance’s article explores the data mining behind this advertising in intriguing detail.
But the rise of social networking isn’t the only legacy of the first decade of the 21st Century. While most of our gadgets have gotten much more social, they’ve also grown smaller and more portable. My first cellphone made calls, received text messages, and allowed me to play Snake. My present phone has a higher clock speed and more RAM than my first Gateway PC did in 1997, takes fairly good photos, browses the web, plays games… in short, the advances of this “tech bubble” now allow me to literally carry a computer in my front pocket.
But cellphones are not the only devices to shrink in size. Laptops have also gotten smaller, but not merely due to the tendency over time of more processing power fitting in ever tinier containers. The rise of the netbook, and more recently the tablet, speaks volumes about our changing consumer needs. These computers are smaller in part due to miniaturization, but also due to the realization that we no longer need portable computers to do everything that a desktop can do. Most users don’t need to run resource-hungry programs on the go. This conscious reversion to less powerful machines reflects a maturation of the information society as a whole.
But scaling back the abilities of our portable computers would not have been possible without the most important legacy of the current tech period—cloud computing. Our hard drives can be smaller because files can be stored remotely; the presence of DVD drives is waning in favor of USB-connectable memory; our processors can be less powerful because many resource-hogging programs are now available as streamlined web-based applications; programs in general can be smaller and more targeted to a smaller set of tasks due to the availability of downloadable apps. In short, we’ve taken the computer and distributed it onto the internet, and now require only a smaller, less powerful machine to interact with those distributed components.
If we are in a new tech bubble, driven by social networking, and if this bubble does pop and wipe out a large number of social sites, we will still be left with our advances in computer portability and cloud distribution. We’ll merely be visiting fewer social music discovery and photosharing sites with our ultra-portable machines.
4th April 2011
…or “I Am the Scully to Tech Pundits’ Mulder”
It’s about that time of year again when the Apple rumormill starts spinning out of control. This time, it’s all about the iPhone 5. Given recent news, it’s way too early to start speculating (more on that later), but that of course won’t stop stock analysts, tech pundits, superfans, and, of course, yours truly. What follows is speculation based on no sources; in fact, it’s nothing but pure conjecture on my part after living 20+ years in the Apple Reality Distortion Field. While I usually just break it down in long-form prose, this time I decided that I’d do a list. Because if there’s one thing the Internets love, it’s a list.
1. Bigger screen
Forget about it. Seriously, just stop. You can’t have it. Have you ever held a cell phone with a 4” screen? Doesn’t it feel ridiculous? Don’t you think Steve Jobs has held one? I bet the words, “This is ridiculous,” came out of the man’s mouth before he fired whoever brought it to him.
Dramatics aside, Apple won’t increase the screen size for practical reasons. You know they won’t drop the Retina Display, so the screen would have to be at least as high-resolution (326 ppi). To get there, they’d have to scale the screen resolution by a factor of 1.5 or 2 in order to preserve the aspect ratio and not force developers to completely redo artwork, something that is widely accepted to be law within Apple. That means the screen would either be 960x1440 or 1280x1920, which works out to roughly 433 ppi or 577 ppi on a 4” screen.
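That pixel math is easy to check: pixel density is just the diagonal pixel count divided by the diagonal length in inches. A quick sketch (the 4” diagonal and the 1.5x/2x pixel grids are the speculated figures from above, not anything Apple has announced):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

# The shipping Retina Display: 640x960 at 3.5"
print(round(ppi(640, 960, 3.5)))    # ~330 (Apple quotes 326; the 3.5" is rounded)

# The hypothetical 4" screens at 1.5x and 2x the current pixel grid
print(round(ppi(960, 1440, 4.0)))   # 433
print(round(ppi(1280, 1920, 4.0)))  # 577
```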
2. No home button
You mean to tell me that a non-obvious 4-finger gesture is going to replace a single click of a button? Seriously; no. Just no. Who starts these rumors? Cut this out.
3. Better camera
OK, finally we get to something with some truth behind it. As a matter of fact, this has been all but confirmed by the loose lips of a Sony executive, who stated that their best image sensors were for the iPhone. Expect something like 8 MP stills, full 1080p HD video, and a zoom lens, probably with the word “Carl” on it. The front-facing camera will be a bit better, too, but don’t expect the world. However, according to the very same Sony exec, the sensors have been delayed due to the devastating earthquake and tsunami in Japan, which brings us to…
4. September release date
Sorry kids, no birthday present for June babies this year. Thanks in part to the delay in manufacturing in Japan due to the tsunami, and in part to production delays with iOS 5, the ship date for the iPhone 5 will likely be in September. It explains why there won’t be any announcement at WWDC this year. Expect the announcement to come during the annual “Buy These Brand-New iPods for Your Kids This Christmas!” event in September, and for Jobs or whomever to say, “Available today at apple.com, and this Friday at Apple stores across the country.”
5. iOS 5
Alright, I already gave it away, but yes, it will debut with iOS 5. The new OS will make an appearance during the WWDC keynote, and may or may not be released to developers as a beta. I’m erring on the side of “Yes” for that one, in order to give developers a few months to iron things out before the September launch. On a side note relating to iOS 5, it will launch across all platforms at once, and possibly come with another new iPad in September with a higher-resolution display (maybe that’s the source for these bigger iPhone screen rumors). It’s a crapshoot at this point regarding new features of iOS, but I’m going to go with lock screen widget support; revised notifications; full access to Core Audio (including Audio Units), Core Video, and Core Image built into the SDK; and completely different aesthetics. If you love the high gloss look of iOS elements, best start saying your goodbyes now. If I’m correct, that is.
6. Other hardware rumors
Quick take time! Here’s the nitty-gritty on the rest of the hardware stuff: No metal back; Apple had a tough enough time with metal sides. Sure, we’ll say there will be a white one. A5 chip all but confirmed. Don’t expect 4G support just yet. Standard-issue upgrades: more RAM, more storage, longer-lasting battery. It won’t be a groundbreaking design physically, just a revision, possibly made with a carbon composite back instead of glass. I really don’t think we’ll see Apple abandon the Dock Connector for anything else (like DisplayPort or Thunderbolt, as I’ve heard) for quite a while, since nothing else is as versatile or supported by as many accessories. Something I haven’t heard at all yet, but have mulled over: Apple could take what it learned from the Smart Cover for the iPad 2 and make well-designed accessories for the iPhone from the beginning.
7. Stupid rumors
There are a few rumors floating around that I simply cannot believe actually exist. That some people are believing them is worse. These include: a physical keyboard, a 3D display, no CDMA model, no home button (worth repeating here, because it seems to be so pervasive), brushed metal casing (seriously, Apple got rid of all traces of brushed metal years ago), memory card slots, and USB ports.
8. One More Thing…
I may eventually eat my words, but I don’t think there will be a smaller, cheaper iPhone. Ever. The closest we’ll get is what we already have: the iPod touch; a smaller, cheaper iPhone without the phone. The more I think about it, the less characteristic it would be for Apple to release a lower model iPhone. The target market for such a device would be people who want an iPhone, but can’t or won’t buy it at the current price point. Apple doesn’t want those people as customers.
iOS devices are all about the experience, not the spec sheet. If Apple introduces a cheaper iPhone that has 99% the experience of its big brother (and why wouldn’t it?), it will severely cannibalize sales of the high-margin big brother. If it’s something completely different, running on something like the OS for the iPod nano or a different version of iOS, it wouldn’t be the full experience. And if it isn’t the full experience, it won’t be nearly as popular. For proof, look at non-iOS iPod sales since the iPod touch was released. Which brings us back to the original point.
So there you have it: the iPhone 5 as it will be. In my opinion. Now all we have to do is wait to see what actually happens. And maybe stop with all this rumor-mongering. But one thing is for certain: I will be wrong about something.
Comments? Let us know on our Twitter @sinecureind or our Facebook Page!
31st March 2011
If you missed it, please read Part 1 of “What I Expect From An App,” from last week.
As I discussed last week, the sheer volume of available apps means tough competition and there is little room for developers to forget small details, lest they have their apps passed over for better, higher rated ones.
Continuing on with my minimum app standards:
There are currently over 40,000 free apps for the iPhone alone. Many of them generate revenue with advertising, and occasionally I’ll click on an ad, because why not? I appreciate that I got an app for free and if I find it useful, why not give the developer a few seconds of my time? But some apps rely on advertising that takes up the entire screen. This is very distracting and can accidentally trigger a click-through to a loud commercial when it’s entirely inappropriate. Ads should be visible, but not distract from the app experience. A good developer would simply place a small price tag on an app from which he or she wanted to guarantee revenue. Large, distracting ads are bad business because they foster negative sentiment for the app and its developer.
This is small, but very important to me: The name that appears below the app icon should be short enough to fit without an ellipsis (“…”). The character limit for naming an app on the home screen is about 12, and it seems amateurish to create a well-functioning app with a nice icon and not spend some time making the name short enough to be readable. This can be tricky, and might result in a more anonymous acronym like “WRS” for “Wild Rabbit Shooter”, but hopefully the app icon is more memorable than the name that appears below it. Speaking of icons, apps that have multiple versions, or are released as both freemium and premium, ought to denote these differences with a small banner on the icon. I have the WeatherBug app, which I like, but now that it’s available as a paid “Elite” version too, the free app is called “Weathe…Free” on my home screen. This is unfortunate because, with such a simple logo, adding the word “Free” on the lower right of the icon would have been a simple feat that went a long way toward a cleaner and more pleasing appearance.
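For what it’s worth, the middle truncation the home screen performs is easy to sketch. The 12-character limit and the exact head/tail split below are my assumptions; Apple doesn’t document the real rules:

```python
ELLIPSIS = "\u2026"  # the "…" character

def truncate_label(name, limit=12):
    """Middle-truncate a home-screen-style label to fit within `limit` characters."""
    if len(name) <= limit:
        return name
    keep = limit - 1              # characters left over after the ellipsis
    head = (keep + 1) // 2        # assumed split: slightly favor the front
    tail = keep - head
    return name[:head] + ELLIPSIS + name[-tail:]

print(truncate_label("Wild Rabbit Shooter"))  # Wild R…ooter
print(truncate_label("Maps"))                 # Maps (short names pass through)
```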
Apps don’t exist in a vacuum. Sharing and creating connections has become not only normal, but necessary for many organizations and individuals (plus it’s fun). While some apps can’t work properly without connectivity to other online services (Pulse, SoundTracking), others benefit greatly from this ability. HalfTone allows users to turn photos into gritty halftone images, and even add comic-strip captions and crumpled or slightly yellowed photo borders. By itself this app is fun and worth 99 cents, but the ability to instantly upload finished photos to email, Twitter, and Facebook puts it over the top. Apps that allow for communication with other services are a staple in this interconnected world, and absolutely necessary for sharing ideas, even if one of those ideas is a cartoon version of your friend.
24th March 2011
With over 350,000 apps currently available through Apple’s App Store, it’s truly a buyer’s market. And while many apps perform similar functions, not all apps are created equal. From the silliest games to turn-by-turn driving direction apps, I have minimum standards for what I look for in an app:
Try before you buy…
I won’t begrudge paying 99 cents for an app that may be terrible. After all, ratings go a long way in helping me decide to spend a buck. But if an app costs more than $3, there ought to be a freemium version. I know that some people (myself included) will drop $3 on a cup of coffee in an airport and think nothing of it long after the caffeine rush is gone. Some $2.99 apps might be far more useful than one cup of coffee, but for some reason apps feel more expensive than they are. Maybe it’s because I’m not buying a physical product, so the decision feels like more of a gamble. This cartoon from The Oatmeal illustrates the concept perfectly. In any case… Developers, I’m neurotic about spending money on apps, so please let me try before I buy!
Low crash rate…
Not all crashes are the fault of the app. Smartphones are mini-computers after all, and they occasionally need a hard restart to fix weird behavior, like apps crashing that normally don’t. But some of the apps I like to use regularly will crash even after a phone restart. Granted, apps don’t always age gracefully, especially free apps whose developers have moved on to other projects and are no longer providing updates. However, I expect that an alarm clock app, whose only function is to keep time and play music at a specified hour, will continue to work through firmware updates and not crash in the middle of the night, leaving me late for work.
We live in an age of infinite distraction. There are multiple to-do lists around my apartment, on my laptop, and in my head. I don’t always get to them. I still remind myself to write a Yelp review for that nice restaurant I visited… in January! Many apps are of great use to me, but I’m always forgetting to rate them, especially after I’ve used them for a week or so and have decided to keep them on my phone. They sort of become part of the scenery of my digital life, and unfortunately for the developers, I forget to give them 4 or 5 stars and a helpful review. I’d like to see apps remind me to write a review—not when I first open them, or every time I open them, but maybe on the 5th open. And I’d find it very helpful if they could link me to the App Store directly, so I don’t have to go looking. Help me help you, developers!
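The reminder logic I’m asking for is trivial to sketch: count launches and prompt exactly once when the count reaches a threshold. The function name and the in-memory dict are mine; a real app would persist the counter between launches:

```python
def should_prompt_for_review(state, prompt_on=5):
    """Bump the launch counter; return True exactly once, on the
    prompt_on-th launch, and never again after that."""
    state["launches"] = state.get("launches", 0) + 1
    if state.get("prompted"):
        return False              # already asked; don't nag
    if state["launches"] >= prompt_on:
        state["prompted"] = True  # remember so the user sees it only once
        return True
    return False

state = {}  # a real app would persist this between launches
results = [should_prompt_for_review(state) for _ in range(7)]
print(results)  # [False, False, False, False, True, False, False]
```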
Check here 31 March 2011 for Part 2!