"Daddy, What Was Life Like Before the Internet?"

That is a question I will never have to answer, because Mike Huckabee's chances of becoming the next President of the United States are far greater than my becoming a father—although not a Daddy (wink, wink)—at this stage in my life.

But it's still a valid question. While science has proven that our perception of the passage of time changes as we get older, it still seems I had more free time than I knew what to do with before ubiquitous devices arrived in my life. I remember, pre-PC Revolution, not having to make time to see a movie, or go to a mall, or go to the beach (the first casualty of life in San Francisco after a computer arrived in my apartment), or, when I lived in Tucson, drive out to Reddington Pass, take a hike, and expose my totally bare bits to nature.

You were expecting to see my exposed bits? Not a chance.

I used to paint, and while there were periods prior to devices when I went years without creating anything, the last time I picked up a brush was nearly eight years ago—and that's rapidly closing in on a record. I'm not that concerned about that particular activity because my Muses have always been fickle bitches, but it seems I just can't find time for a lot of the other things I used to enjoy and always had time for—like wandering around downtown taking photos. That was something I did almost every weekend—if not more often—and now it seems to be a special occasion when I can actually get around to doing it.

And how did we live without Google and Wikipedia? It seems funny now, but once upon a time I was actually able to do my job with just the knowledge I had in my head. I also used to know what every single file in Windows (okay, it was version 3.1, but still) did. Now the whole tech field has become so…complicated…that knowing everything about everything is simply no longer possible, and it seems a day doesn't go by that I don't have to refer to the Google for the solution to some obscure problem (both Microsoft- and Apple-related).

Part of me really wants to just unplug, but on the other hand, so much of my life is wrapped up in these machines now that it's all but impossible unless I print out everything and keep hard copies. Do I know anyone's phone numbers beyond Ben's and my sister's? Don't be ridiculous. Do I know what I have scheduled for next week, or next month? Not a clue unless I bring it up. Some birthdays I remember, but I still need to double-check my phone when we're out shopping to make sure I don't neglect to buy cards. Do I know any of my bank account or credit card numbers? I used to possess that knowledge prior to being online, but now I can remember maybe the last four digits of one or two accounts. And now, with all my ridiculously complicated passwords safely stored away in a secure vault program, do I even know more than a smattering of those? Ha ha! That's funny!

I've always been interested in tech, so it's no surprise I took to this stuff like a fish to water when it first started arriving on the scene, but I wish I knew how to regain some of that free time I used to enjoy without having to purposely carve it out.

Beautiful



Entire generations will never know the joy of the "big iron" era of audio equipment. Pictured: Sony TAE-5450 preamp and TAN-8550 power amp. $1000 and $1300 respectively, in 1975 dollars.

Even now, people who have the requisite knowledge to keep this tech working are getting harder and harder to find.

Why Can't They Just Put It All on One Chip?

This was a question I initially posed to a friend of mine back in high school while we were discussing one of our favorite shared passions, audio equipment. Digital amplifiers were starting to appear on the scene, and I wondered aloud why all the circuitry couldn't be shoved onto a single slab of silicon and be done with it. My friend (who knew electronics on that level) said it was theoretically possible, but that you wouldn't be able to build an amplifier in the 80-100 watts-per-channel range we were enjoying at the time because of the heat and power constraints. This was the age of "big iron" in audio, after all.

A few years later when I first started getting into personal computing I found myself asking the same question. Why can't all this crap be put on a single chip, or at least on the main system board?

Why did I need separate plug-in cards to control the hard drive? Why did I need a separate card to control the I/O functions? Why did I need a separate video card? WHY COULDN'T ALL THIS STUFF BE ON A SINGLE BOARD?

Well, for starters, at the time the technology just wasn't there; we're talking the 8088/286/386 era, after all. The first few computers I owned (built from parts I'd gotten at computer fairs) had banks of discrete memory chips on the system boards, and a 20 MEGAbyte drive was considered big! Good luck trying to troubleshoot a bad chip if you ended up with one. That's why I was dancing in the streets when the first SIMMs started to appear. Imagine that: four (or however many) memory chips soldered to little circuit boards that just snapped into the system board!

As the years progressed, I was happy to see that I wasn't the only one who had been wondering why all these discrete items couldn't be made part of the system board, because slowly drive controllers and I/O found their way onto motherboards, and before long, even video was becoming a standard part of the build. You could still buy souped-up peripheral cards, but they were no longer a necessity to build a functioning system.

And now here we are in 2015:

Just look at what Apple's done with the latest MacBook. We still aren't to the point where everything is on a single chip, but we're damn close. That tiny system board not only contains the CPU, memory, and controllers, but also the machine's solid state "hard" drive.

Being the inveterate nerd that I am, I've always taken great pleasure in peeking at the guts inside my tech, and I have to admit that as we get closer and closer to the "everything on one chip" future I used to dream about, a part of me is kind of disappointed there isn't a whole lot left to look at when you pop the hood.

For Posterity

With our society's increasing reliance on digital storage, it occurred to me the other day that all the sounds and images we're amassing will—in all likelihood—be irrevocably lost to future generations because of the unstoppable pace of technological change barreling down upon us. In fifty years, not even the NSA will have access to the petabytes of data it's amassing unless that data is constantly refreshed and migrated to the latest formats. And I seriously doubt anyone's got time for that.

The ancients knew what they were doing. Stone tablets—barring their outright destruction—last for millennia. Paper can last for centuries if properly curated. Digital media…not so much. If "bit rot" alone doesn't rob our descendants of our history, format obsolescence will: all the formats currently in use will no doubt be obsolete and unreadable in less than the span of a human lifetime.

This is already becoming a problem. Have you ever tried to open a document created in the original version of WordPerfect? (Is WordPerfect even still a thing?) Yeah, a basic text editor can still pull out the important information, but the time required to strip out the machine code and reformat that information into its original form is horrendous. I ran into this recently while trying to retrieve the journals I'd written in the late 80s and early 90s.

Don't even get me started on image formats or anything done in old desktop publishing programs. Anyone remember Ventura Publisher? Just try to open one of those documents. Good luck.

I guess what I'm trying to say is that in our rush to digitize the world, for all the ease it's provided in recording the minutiae of our lives (anyone remember being limited to 12, 24, or 36 exposures at a time on film?), we're ultimately in danger of losing that record altogether.

Amen to That!

Rene mirrors all my feelings about Instagram. From iMore:

I've been using Instagram for a long time. I almost always post my photos to Instagram and then share from there to Facebook and Twitter. I use it enough that I've become somewhat inured to its common limitations — its rough edges and brick walls — that most people run into every day. Hearing John Gruber discuss them with Ben Thompson on The Talk Show this week, however, made them all front-of-mind for me again. As such, here are five frustrations I currently have that I'd like to see Instagram tackle this year.

1. Higher resolution photos

Instagram launched at a time when the iPhone's camera was three megapixels and the device's top speeds were limited by 3G service. Now we have 8-megapixel iSight cameras and LTE radios. Yet Instagram's native resolution on iOS remains 640-by-640. For viewing, that's okay… in the same way listening to low-bitrate MP3 music is "okay." For appreciating, however, it's disappointing.

On a modern iPhone, Instagram works with and even lets you save 2048-by-2048 images, but won't display them on the service. Perhaps the company could upload the high-res image to the service if you have the connection to do so, then display scaling resolutions based on whatever device is viewing it, a la Dropbox's Carousel (née Loom) or iCloud Photo Library. That way low-end, low-resolution devices are still "okay," but high-end devices like the iPhone 6 Plus can view the high resolution — assuming they have the bandwidth to do so.

Storage and its associated costs would go up for Instagram, but they have Facebook money and infrastructure to lean on now. And for a service built on photos, a reasonable quality level should be table-stakes.

2. Instagram for iPad

Instagram debuted on the iPhone. Eventually it moved to Android and the web (ish). It added video and dabbled with messaging. What it didn't move to, and what it didn't add, was an iPad client. This thinking might have made sense when the iPad launched without a camera, but now that the iPad Air 2 has a very capable 8-megapixel iSight camera, it's an obvious, gaping omission.

Given its 2048-by-1536 laminated Retina display with 264 pixels-per-inch, the iPad has the perfect screen for photos and videos. That means it has the perfect screen for enjoying high-resolution Instagram shots as well.

Mobile-first defines Instagram, to be sure, but iPads are mobile devices. Apple has sold hundreds of millions of them, so they're also among the most popular mobile devices and the most popular mobile cameras. They are, in fact, the best cameras some of us have with us. It's long past time iPads and iPad photography were treated as first-class citizens.

3. Non-destructive edits

Instagram has grown from a way to filter poor iPhone shots to a way to edit-to-perfection all the moments we want to share with the world. The app has a full range of tools, offering expanded filters, brightness, contrast, saturation, color, highlights and shadows, blur and sharpness, and more.

During the editing process you can move through all of them, tweaking and adjusting with impunity — non-destructively — until it's exactly the way you want it. Once you post it, however, those edits are burned in forever.

Imagine instead, like the iOS 8 Photos app, all those edits were saved as a delta file and stored alongside the original image. What you see in the Instagram feed would be the same, but you could go back in and tweak an image again any time you wanted, alongside its caption or tags.

Again, it would require slightly more storage on the backend, but it would also allow for more time spent using Instagram — a plus in any company's book.

4. Native regrams

If I like a tweet, I can retweet it. If I enjoy a Facebook post, I can share it. On Instagram… nothing. If I really want to re-broadcast something, I need to screenshot it, crop it, and post it either as my own photo with some caption attribution or use one of those ugly third party apps that stick a citation label on it.

At best, it's awkward and unwieldy. At worst, it leads to shitpic degradation through repeated application of compression.

Right now, to discover great new Instagram accounts to follow, I have to leave my timeline and go squint through the middling Activity list Instagram provides. Native regrams would solve that problem as well. And for anyone who thinks of regrams as clutter, an opt-out could easily be provided in the app's settings screen.

5. Active links

"See link in my profile" is a thing on Instagram because that's the only place the app currently supports active links. If you type a link into a caption, it renders as plaintext. Tap on it, and it does nothing but laugh at you, silently.

By all means, keep links — and the spammers who would abuse them — out of the replies. But enable them in the original caption we add when uploading a photo. That way, I can not only share moments, I can share a way for other people to get more information and potentially share that moment as well.

Your most-wanted Instagram features?

Some people want a native Mac app as well, or channels so they can follow trends or events. Others want an easy way to re-download their own photos or multiple account support so they can post personally and for the job or hobby. Still others want better state-preservation so they don't lose their place when they exit the app, or support for line breaks in comments.

But what do you want? If you were running product development for Instagram this year, what would be on your upcoming feature list?

It's Magic

Since one particular day back in 1979 when my friend Steve casually mentioned, "I was listening to so-and-so's new album while I was in the shower today, and…" I have been incredibly jealous of the fact that he had his entire house wired for sound.

This was not common in 1979, but since he worked as a DJ at one of the popular gay clubs in Phoenix at the time, it also wasn't particularly surprising.

It wasn't until many, many years later, while living in what was to be my last San Francisco apartment, that I was finally able to realize my dream of having decent sound in the bathroom while I was showering. I ran wires from my rig in the bedroom down the hall and into the bathroom, where they connected to some small Infinity satellite speakers I'd picked up. When all was said and done, I was admittedly kind of disappointed; it sounded great, but a lot of the anticipated thrill had disappeared during the intervening years, and it got to the point that my downstairs neighbors were dropping so many snide comments about hearing 20-year-old disco first thing in the morning that I eventually took it all down.

When I moved back to Phoenix I wasn't able to run the wires in a way that could be as easily hidden as they'd been in San Francisco, and an extended period of being out of work forced me to sell the gear I'd been using so I gave up on having hi-fi in the bathroom.

Fast forward to 2012 or thereabouts and the arrival of Bluetooth technology into my life.

Now I have a speaker smaller than a power strip that pumps out great sound from my iPhone at a moment's notice. No running wires or having to mount anything on the wall. Tunes in the shower anytime I want.

It's magic.

Microsoft's New Mac vs. PC Hyperbole

From Rene Ritchie at iMore:

Microsoft has three new Surface Pro 3 ads out today that, as promised, switch from trying to attack the iPad to trying to attack the MacBook Air. Given how heart-breakingly, bank-breakingly unsuccessful Surface has been to date, it's hard not to sympathize. It's also hard not to think repeating past strategic failures will only result in more failures. Instead of shifting from iPad to MacBook, maybe Microsoft should shift from attacking Apple to attacking the PC market?

Mac sales were up 18 percent year-over-year last quarter. The Mac has grown 32 out of the last 33 quarters. That's against Windows PC sales that continue to be on the decline.

It's incredibly tough to imagine anyone would leave a MacBook Air for a Surface Pro 3. More specifically, that they'd leave the ability to run OS X on hardware of that caliber for Windows 8 on anything. Especially because the MacBook Air can run OS X and Windows 8. Putting Windows on a tablet turned out to be a liability not a feature and it looks like the same is going to hold true for hybrids.

People who use Macs use them intentionally. We love not only the quality of the hardware but the experience and workflow enabled by the software, by OS X and iLife and iWork and all the OS X-exclusive apps by Panic and Flexibits and Tapbots and Aged & Distilled and SuperMegaUltraGroovy and The Iconfactory, and many more. That's simply not attainable by PC hardware, and certainly not by the widely-maligned Windows 8.

It's telling that "run Windows instead of OS X" wasn't even suggested as a benefit in any of these three new ads. When Apple ran their famous "Mac vs. PC" series, OS X being better and preferable to Windows was almost always front-and-center.

Microsoft does mention running Office and Photoshop, but both of those apps are available on the Mac. Office is available for iPad now as well, as are really great detachable keyboards. They also suggest you need a paper note book to use a pen with Apple products, which, given the stylus market for iPad, is either ignorant or deliberately false.

I'm almost tempted to suggest Microsoft would be better off running an ad encouraging OS X customers to buy a Windows license for their Macs, to get the "best of both worlds", but again, given how poorly Windows 8 has been received, that probably wouldn't help very much. Maybe focus on Bootcamp and gaming?

I'm even more tempted to suggest Microsoft shouldn't focus on Apple at all, and go gunning for Dell, HP, Lenovo, and other PC vendors instead. People who buy PC laptops and hybrids are already Windows-only customers. All the things Microsoft is actually showing off in their ads — great specs, capacitive touch, pen input, etc. — are probably things Windows-only customers would be really interested in. Hell, for anyone used to the creaky plastic and gaudy stickers of many Windows laptops, Surface could be a welcome upgrade. Even for people with higher end PC ultrabooks, getting something not painfully, slavishly derived from Apple design could be a breath of fresh air.

Given the politics involved in Microsoft's OEM partnerships, however, I don't think we'll ever see that happen.


Who Even Buys Physical Media Anymore?

I think it's a legitimate question, and one that popped into my head while inventorying about two dozen Blu-ray players that were unceremoniously dumped into our department the other day. They were mostly prototype units provided by manufacturers for testing, or outright trash…ridden hard and put away wet by Testing & Quality Assurance, but one or two (a Yamaha A-1000 in decent condition, for instance) caught my eye, and I found myself thinking that since we don't actually own a Blu-ray player, I should probably keep it in mind when the department does its annual garage sale—especially since we get first dibs on anything being sold and could probably snag it for ten dollars or so.

But then I thought, "Why?"

I haven't bought a DVD since Prometheus (yeah, yeah, I know…) came out, and with the exception of The Addams Family/Addams Family Values disc that Ben got me after catching it on TV one night a couple months ago, we haven't purchased any physical media at all.

It's kind of funny, actually. Back in 2005 or thereabouts, when iPods were first becoming "mainstream," I was a holdout. I owned hundreds of CDs, and my holier-than-thou audiophile attitude was that MP3 encoding was shit. I was still using MiniDiscs for my portable and in-car needs, and I couldn't fathom getting rid of physical media. I mean, I grew up in the age of vinyl, for chrissake!

But then I met Ben, and things changed.

I bought my first iPod a few months after I got my first Mac and found myself loving it. Maybe there were a few sonic glitches here and there, but I also realized that my high-range hearing wasn't what it was 20 years ago and I couldn't really hear the difference anyway.

Suddenly (or rather, not so suddenly, as it took months to rip all those CDs) I was able to carry my entire CD collection around with me. I then started ripping vinyl, and not very long afterward (okay, it was a couple years) I had my entire collection in my pocket.

Something similar is happening with my relationship to video, although I've never been as intimately involved with movies as I was with music. At one point I owned a hundred or so movies on disc. But when I was out of work a year ago and had to come up with cash, I realized it was time for the majority of them (along with my CDs) to go. Thankfully there's still a market!

I kept a couple dozen CDs that had some sentimental value, and a dozen or so DVDs that I actually have watched more than once and want to keep for that reason, but by and large the rest of it was sold off.

And I haven't missed any of it. New music is purchased directly through iTunes (or, acquired through other sources). With Hulu and Netflix, pretty much anything I'd ever want to watch is available on demand.

Much like I proclaimed in 1977, "I'll never see that stupid Rocky Horror Picture Show movie!" only to stumble upon it at a friend's house ten years later and immediately fall in love with it, in 2005 I proclaimed I'd never get rid of my CDs. Both proved ridiculously shortsighted.

This.

Are you sure you want the Internet of Things?

From Mat Honan at Wired:

I wake up at four to some old-timey dubstep spewing from my pillows. The lights are flashing. My alarm clock is blasting Skrillex or Deadmau5 or something, I don't know. I never listened to dubstep, and in fact the entire genre is on my banned list. You see, my house has a virus again.

Technically it's malware. But there's no patch yet, and pretty much everyone's got it. Homes up and down the block are lit up, even at this early hour. Thankfully this one is fairly benign. It sets off the alarm with music I blacklisted decades ago on Pandora. It takes a picture of me as I get out of the shower every morning and uploads it to Facebook. No big deal.

I don't sleep well anyway, and already had my Dropcam Total Home Immersion account hacked, so I'm basically embarrassment-proof. And anyway, who doesn't have nudes online? Now, Wat3ryWorm, that was nasty. That was the one with the 0-day that set off everyone's sprinkler systems on Christmas morning back in '22. It did billions of dollars in damage.

Going back to sleep would be impossible at this point, so I drag myself into the kitchen to make coffee. I know this sounds weird, but I actually brew coffee with a real kettle. The automatic coffee machine is offline. I had to pull its plug because it was DDOSing a gaming server in Singapore. Basically, my home is a botnet. The whole situation makes me regret the operating system I installed years ago, but there's not much I can do. I'm pretty much stuck with it.

When I moved into my house in the 20s, I went with an Android-compatible system because there were more accessories and they were better designed. But then I changed jobs and now my home doesn't work with my company-issued phone. Which is a bummer because I have to keep this giant 7-inch tablet around to control everything and Google doesn't support the hardware anymore so I can't update it and now the door just randomly unlocks. Ugh, I'm going to have to start using keys again.

I'd just reinstall the OS, but that would be too expensive. Besides, all my Nexus Home® stuff uses proprietary chargers, and I can't deal with having Amazon drones come in and rip out the drywall again.

Everyone thought the connected home would be Apple or Google's game. Turns out, that was short-sighted. An Internet-connected thermostat? LOL. Of course it was entirely about who would gain control of your SmartWall. It was the thing that controlled the screens and the lights and alarm clocks and burglar alarm and outdoor atmospheric monitoring system and interior climate control and mirrors and irrigation system and solar collector and water filtration and grocery inventory management database and kitchen appliances and communications center and automobile docking system and exercise equipment and biofeedback monitoring and medicine dispensary and stereo that mattered. But in fairness, who could have foreseen the Microsoft-Samsung deal or its consequences?

"좋은 아침입니다, Mat" my oven chirps through the speakers in the ceiling, as I place the kettle on the induction element. "조용히," I mutter.

So I just replace things here and there as they quit working. Which means I've got a mishmash of Apple, Android, and Samsoft components all cobbled together. Nothing works exactly right. It's a huge mess.

As I plod through the kitchen, my floor lights up, exposing rows of flashing LEDS, and a snippet from an old Queen song starts to play. "Congratulations!" purrs my house in an Elvis Presley voice. "You've just hit your step goal for the day!" Years ago I reset the step goal to 20 because I was tired of my house nagging me all day. Every time my couch vibrated or my TV told me to get up and walk around, I found myself resenting my home a little bit more.

I sit down with my coffee and fire up the short throw projector embedded in the kitchen table. The news is depressing, so I flip through a Redfin search I started last night in bed. There are these houses up in Humboldt County that are listed in the inundation zone, so they were never required to upgrade. That was a cartography error; even if sea levels go up another 20 feet they would still be above the water line. They're rustic, and don't even have high energy automobile docks. But the idea of getting off the grid really appeals to me, even if it's just a fantasy.

The skylights open up. The toaster switches on. I hear the shower kick in from the other room. It's morning.

This reminds me a little bit of a short story I read as a child by Ray Bradbury (at first I thought it was Arthur C. Clarke, but it was so long ago that I misremembered; thanks for the correction, guys!). It was about an intelligent house that took care of its family's every need, from cooking their meals to cleaning and mowing the lawn. But something went wrong and a fire broke out, spread, and could not be contained. As it turned out, the house had been unoccupied for some long, unspecified length of time; the only indication that a family had ever been there was the atomic shadows left on an outside wall…

Why Can't It Just Be Simple?

Once upon a time I got my internet, cable, and seldom-used land line telephone from one provider. Life was good. Things were simple.

Then we moved to Denver. My previous provider didn't have a presence in Colorado, so we went with Comcast for everything. (We actually dumped the land line shortly after getting it because all we were getting were collection calls for whoever had the number before we did, and we haven't regretted that decision one bit.)

Service with Comcast is okay—as long as you don't want to make any sort of change once you're set up. Then it becomes a trip through the seven levels of customer service hell; something we discovered about a year ago when we switched from our reliable CableCARD/TiVo setup to Xfinity's native DVR and—having realized what a horrific mistake that was—switched back.

Nine months ago, Ben's mom, then a newly minted Comcast employee, took over our account so we could get basically free service. It was a nice perk, but we thought that when she moved out we'd just take her off the account and everything would go back to the way it was before.

Not so fast, Buckaroo! Comcast informed us that we would have to set up a new account, since Julie was now the account holder. In addition, they would have to send a technician out to put a new CableCARD in the TiVo (a $70 charge, and no, we couldn't simply go to a Comcast store and pick one up ourselves) because the existing card couldn't be switched over.

Not wholly unexpected, especially since that woman has destroyed so many other things since her arrival here. But still…

Fuck that.

The one thing I feared the most was losing my Comcast email address because it was tied to pretty much everything, but I bit the bullet last week and moved it all over to my iCloud address. Yes, it was a major pain in the ass, but since it was obvious I was going to lose the Comcast addy, it had to be done. (And now I know why so many people have addresses not tied to a specific service provider.)

I balked at the thought of having to go back to DSL speeds, but with Comcast out of the picture we had no other option. We signed up with CenturyLink for the 40 Mbps tier. I'm not holding my breath…

Even though I'm now working for one of the "other" content providers and am eligible for basically free television service, I hadn't rushed to get signed up because I was reasonably certain that our west-facing apartment wouldn't afford the line of sight needed to the satellite. Our only other choice was my company's competitor, but after a horrific experience with that company in Phoenix, Ben swore he'd never go back to them.

So I signed up for my free service. The biggest problem we initially faced was that our building was pre-wired for the competitor's system, and the landlord told us in no uncertain terms we could not use the other. Well, fuck that too! As it turns out, the FCC has rules in place that allow us to do exactly that—as long as it's not specifically spelled out otherwise in our lease (it isn't), and we don't physically attach anything, drill holes, or make changes to the premises. In other words, a dish on a tripod on our balcony running a flat wire under the door to the inside is fine.

Well, the installer came out today and confirmed what I'd suspected. We don't have a clear line of sight to the satellite without mounting the dish on an extension arm on the balcony railing. That we can't do.

So that leaves getting our service from my employer's competitor that has central dishes already mounted on the building. We have no other choice at this point. Hopefully our experience with them will be better than what Ben went through on his own in Phoenix.

And of course we can't schedule a time when both companies can be here on the same day so that there's minimal time off work and loss of service. That would be too easy!  We supposedly won't be without television since that's getting moved over the day before Julie switches her service on at her new place, but we'll be without decent internet for the better part of the week, having to rely on cellular LTE tethering.

I know, I know. It's all first world problems, but fuck! Why can't this shit be simple?

It's Only a Matter of Time

The rumor mill has been abuzz about the expected iPhone 6 supposedly coming out later this year. Personally, I don't care about these rumors one way or another, because I learned long ago that the majority of them are complete bullshit—as exciting and inspiring as they may be. I only need to look back a couple years before the introduction of the iPhone 5 to see images of a "radical" redesign. My favorite was a wedge-shaped phone similar in design to the MacBook Air:

I was sorely disappointed when that did not come to pass. So I have no faith in any of the admittedly beautifully rendered speculations on what the next phone will look like. It is what it is, and we'll all see it when the time comes.

One thing I am hoping for is a 128GB capacity option—and that's for one reason only: music.

My iTunes library is currently hovering around 100GB and I would love to have any of it available on a whim, instead of having to manually shuffle songs in and out of my current phone. Apple hasn't updated the iPod classic (160GB) in years, so a device of any kind with this capacity is overdue.

Of course, with their current focus on the cloud, I'm sure they're simply hoping that those of us with huge music collections will just subscribe to iTunes Match and stream everything.

All well and good, except I (and I suppose many others) have hundreds of songs that aren't available in iTunes; songs lovingly ripped and edited from the original vinyl. Plus, why should I have to pay again for access to things I already own?

Anyhow, that's not really the thrust of this post. "It's only a matter of time" refers to the eventual day that Apple puts out the much-rumored "iPad Pro," a device that will finally be able to replace the average person's laptop in its entirety. 256GB of flash storage? Why stop there? Let's go for 512GB or 1TB. I know it's not economically (or physically) practical today to build that kind of device as thin and light as the current iPad, but a few years out? We'll all be laughing at the ancient relics with "only" 128GB of internal flash storage.

It's said that the reason the majority of people—even those of us who own an iPad in addition to a laptop—have not given up on our laptops completely is that while consuming content is a tablet's primary attraction, it's still much easier to create content on a laptop (or full-blown desktop) than it is on an iPad…although, again, I'm sure it's just going to be a matter of time until the available software tools and hassle-free connections to multiple external monitors make that a moot point as well.

Obviously, this won't happen overnight. Nor do I expect laptops to be abandoned in 2, 3, or even 5 years down the road. But a decade from now? I fully expect the dominant platform will be tablets in one form or another.

Never Again

Tonight I thought I'd install Windows via Boot Camp onto my Mac so I wouldn't have to bring that stupid HP laptop home from work on the rare occasions I needed to.

What a mistake that was.

Yeah, it installed easily enough, and surprisingly, it actually worked. That is, it worked until I downloaded 135 software updates and, after rebooting, the network adapter disappeared.

It's shit like this that caused my initial move to Apple back in 2009.

Fuck you, Microsoft. Windows will never be installed on anything I own ever again.

Ouch.

Can't say I disagree…

"Windows 8 is a disaster in every sense of the word.

"This is not open to debate, is not part of some cute imaginary world where everyone's opinion is equally valid or whatever. Windows 8 is a disaster. Period.

"While some Windows backers took a wait-and-see approach and openly criticized me for being honest about this, I had found out from internal sources immediately that the product was doomed from the get-go, feared and ignored by customers, partners and other groups in Microsoft alike. Windows 8 was such a disaster that Steven Sinofsky was ejected from the company and his team of lieutenants was removed from Windows in a cyclone of change that triggered a reorganization of the entire company. Even Sinofsky's benefactor, Microsoft's then-CEO Steve Ballmer, was removed from office. Why did all this happen? Because together, these people set the company and Windows back by years and have perhaps destroyed what was once the most successful software franchise of all time." ~ Paul Thurrott

A Question for the Hive Mind

I want to create a centered, borderless 2×2 table to enclose these images. Ideally I'd like to have a small white space between them both horizontally and vertically, but at this point I'd be happy just to get the entire group centered with no borders. How do I do it? I know only basic HTML and no CSS, but I also know I must have at least one reader out there who knows how to do this in their sleep and can send me the necessary code to copy and paste…
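Update: for anyone else wondering, here's one way to do it entirely with inline styles, so there's no separate CSS file to manage. This is just a minimal sketch; the image file names are placeholders to swap for your own.

<table style="margin: 0 auto; border: 0; border-collapse: separate; border-spacing: 8px;">
  <!-- margin: 0 auto centers the table; border-spacing adds the white space between cells -->
  <tr>
    <td><img src="photo-1.jpg" alt="photo 1"></td>
    <td><img src="photo-2.jpg" alt="photo 2"></td>
  </tr>
  <tr>
    <td><img src="photo-3.jpg" alt="photo 3"></td>
    <td><img src="photo-4.jpg" alt="photo 4"></td>
  </tr>
</table>

The margin: 0 auto centers the table, border: 0 removes the borders, and border-spacing puts a small gap between the cells both horizontally and vertically (border-collapse: separate just ensures the spacing isn't collapsed away by the site's stylesheet).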


Ouch.

This one's gonna leave a mark:

"Then came Windows 8. It was truly awful at a fundamental level that Windows 8.1, 9, or 10 can't fix unless Microsoft destroys their faltering mobile ambitions once and for all. They won't. Keep in mind that is a fundamental design choice that is the very basis of the OS design, not a few features that need polishing. Apps won't come until the user base comes up and IDC has backed up SemiAccurate's long-held view that this isn't going to happen. Note the lack of a qualifier like soon, 2017′s numbers are not close to enough even if IDC's rosy picture pans out. And in light of this, Microsoft forced current and potential customers, pick a tablet OS, us or them, the choice of a real OS has been completely removed by executive fiat.

Stop for a second and ponder the magnitude of this stupidity, the sheer negligence that has to be rife among Microsoft managers and senior executives to allow this. They knew what they were doing and did it anyway; Microsoft knows what users want and made the conscious choice not only to offer something different but to deny customers the possibility of getting what they needed to work. Every objective third party said it would be a disaster, but Microsoft carried on. On top of this, Microsoft gave users another choice: abandon your current Android/iOS apps, music, movies, knowledge, social links, and devices and come to our OS that doesn't have any apps, music, movies, inherent knowledge, social links, or devices instead.

Worse yet, by forcing the new miserable and counter-intuitive UI on people and not allowing legacy apps in the new store or on WART/Windows Phone, they quite effectively told people you can't take anything from your current Windows software catalog and knowledge base with you either. Developers would have to recode their entire software base for something with a vanishingly small market share and make something incompatible with the billions of PCs currently in use. For some reason they didn't. Buyers would have to abandon much of what they had bought for Windows 7 and earlier (some software, some media, habits, and many devices that worked for them and that they liked) for something that didn't have apps, didn't work well, and was truly unpleasant to use. For some reason buyers are not flocking to Windows anymore; just the opposite.

It all comes down to Microsoft. The company is so blindingly mismanaged that it is almost impossible to describe in mere words. Microsoft knowingly painted themselves into a corner. When it was clear that the Windows 8 master plan had unquestionably failed, they simply kept going, because the situation they engineered meant all other options were far worse. None of these options can succeed; it is just a question of how bad the outcome will be and how soon it will happen. The company has knowingly destroyed their rock-solid desktop monopoly to chase a dream in a way that had absolutely no chance of succeeding. They had to know this before they started; they may be myopic, single-minded, and stubborn, but they aren't stupid. That said, they have still failed."

Preach!

Words I could've written myself, from Adrian Kingsley-Hughes at ZDNet:

"These are words I never thought I'd be writing.

After more than two decades of being a dedicated Windows power user, someone who over that time has installed and supported countless systems running versions of Windows spanning from 3.0 to 8.1, I've now all but given up on the platform.

It might sound odd, but writing these words actually makes me sad. I devoted my 10,000 hours to mastering the platform, plus thousands more, and got to the point where there wasn't a file, registry entry, or command line trick that I wasn't familiar with.

I knew how to make Windows work.

But now, other than for test systems and virtual machines, I carry out my day-to-day work on a variety of OS X, iOS and Android systems. I barely give my Windows PC systems a second glance. My primary work system is a MacBook Pro, and in the ten months I've had it, it's flawlessly done everything I've asked of it, from running Microsoft Word to rendering 4K video. I've lost count of the number of notebooks I've owned over the years, but this MacBook Pro is, by far, the most reliable system I've owned, and I put part of that down to the fact that it doesn't run Windows.

Sure, I've downloaded and installed Windows 8.1 onto a number of systems for testing, and I've put an awful lot of hours into getting to know this latest release of Windows, but I see nothing in this new version that excites me sufficiently to tempt me back into the Microsoft ecosystem. If anything, the effect has been the exact opposite, confirming my belief that parting ways with Windows was the right thing to do.

So what's brought me to this point in my tech career?

Support fatigue

I've spent almost my entire adult working life involved with PCs, and the more PCs you are around, the more sick and dying PCs you encounter. And I've encountered a lot.

I've also cajoled and coaxed countless ailing systems back to life, but during that time I've come to realize how fragile the Windows operating system is, and how something as small and insignificant as a bad driver, incorrect settings, or the stars being in the wrong position can bring a system to its knees and result in hours of work searching for a solution. That's great if you're being paid by the hour to solve PC problems, but if you're dealing with your own systems and you have better things to be doing with your time, then you want to get them up and running as fast as possible so you can get back to real work.

Troubleshooting is costly, time-consuming, and frustrating, and while I once used to relish the challenge, I now try to avoid it whenever possible.

Of all the desktop operating systems that I've used, the modern Windows operating system is by far the most fragile. It didn't used to be like that. I had Windows NT 3.5/4.0 systems, and some Windows 2000 machines, that were rock solid. Partly this increase in fragility is down to the vast ecosystem of hardware and software it has to support, and partly it is down to the years of legacy that each version drags behind it. But part of the blame also lies at Microsoft's door for not putting enough effort into hardening the system, reducing the effect that faults – in particular software faults – have on the system, and providing better information when things go wrong.

Adding a 🙁 to the Windows 8 BSoD screen isn't enough.

Windows systems keel over, and most of the time the only clue you have as to why is an ambiguous error message, which may or may not be a red herring. This sends you to Google – or Bing – in search of others before you who have suffered a similar problem, and who you hope may have found a solution, which might be in the form of an updated driver, a registry tweak, command line incantation, or patch.

Sometimes you get lucky. Other times you have to try a number of things before you're successful. And sometimes you end up deciding that it's quicker to nuke the system and start from scratch.

And all the while I'm doing this, precious time is slipping through the hourglass.

The shift to post-PC devices

Another reason why Windows has been relegated to the sidelines at the PC Doc HQ is the proliferation of post-PC devices such as smartphones and tablets.

Now I've been using mobile devices for years, and remember Windows CE and the like running on devices with exotic sounding names such as iPAQ and Jornada (remember those?), but these devices were, without a doubt, companion devices. Basic operations such as installing software or moving data required a PC, and so these devices spent a lot of their lives tethered to a Windows PC.

Then Apple changed everything, first with the iPhone, and then with the iPad. Here were devices that were standalone, leveraging over-the-air software downloads and updates, and cloud storage.

I found that I could do more and more with less and less. Tasks that once required a full-blown desktop or notebook PC could be carried out faster and more efficiently on a smartphone or tablet. Unless I want to use full-blown applications such as Microsoft's Office or Adobe's Creative Cloud suite, I can make do with post-PC devices. What's more, I can usually get things done faster since I'm not tied to my desk.

And the great thing about these devices (and I'll throw Android in here with iOS) is that they're there when I need them. I've had an iPhone and an iPad for years, and I can only remember a couple of times when they've let me down.

My experience of Windows on tablets closely resembles that of my ZDNet colleague James Kendrick. Bottom line, they let me down too much to want to bother with them. Why would I trade a reliable iPad or Android tablet for an unreliable Windows 8.1 tablet? Why trade a tablet that just works for one that regularly sends me on quests, roaming the Internet looking for the right elixir to fix the system?

Any hopes I had that x86 versions of Windows would be more stable on tablets have gone. In fact, in my experience, the user experience is worse. Sure, most of the time the problem comes down to a rogue driver or a configuration thrown out of whack, but a problem is still a problem, and these are problems I don't experience with iOS or Android.

Bill Gates was right, there was a market for tablets. Unfortunately, most of those tablets would be powered by operating systems made by Apple and Google. But then, Apple and Google didn't try to shoehorn a desktop operating system onto tablets.

Windows RT is certainly a better choice for tablets, but that's because what you have is the illusion of Windows, rather than the real thing. If Windows RT had come out at around the same time as the iPad, and the software ecosystem had matured at the same pace, then Windows RT would be a real contender, but as it stands right now there's little reason to choose it over iOS or Android.

Unless, that is, you want something that looks like Windows. Which I don't.

The increasing irrelevance of the operating system

Once upon a time, the operating system was the platform on which people ran applications, but as more and more local applications have been replaced by services running on remote web servers, increasingly the browser has replaced the operating system as the primary platform.

Twitter, Facebook, Gmail and countless other web-based services look the same whether I'm using Windows, OS X, or even Linux. On smartphones and tablets, I have the choice of accessing most of these services either through a web browser or a dedicated app.

It doesn't matter what operating system is running my browser, so I'm free to choose the platforms that give me the least headache.

Change for the sake of change

One of the biggest problems I have with Windows is the way that it inflicts change on the user for no logical reason.

For me, Windows 8 was the peak of "change for the sake of change," removing the Start Menu and pushing the Desktop into the background. Yes, I understand why Microsoft needed the Start Screen (because the Start Menu would be too cumbersome for tablet users), and yes, I understand that Microsoft wanted to give apps center stage, but for hundreds of millions of users running Windows on desktop or notebook PCs, these changes did nothing but hurt productivity.

Compare this to OS X or even Linux distros. Here you feel a progression from one version to the next. Yes, sometimes there are changes that are disliked, but overall there's a smooth progression from one version to the next. Jarring changes are best kept to a minimum because they have an adverse effect on productivity, adding unnecessarily to the learning curve.

Microsoft backpedaled on some of these changes with Windows 8.1 (which must have been a pain for users who had gone to the effort of learning how to use Windows 8), but for me the damage was done. It's clear that Microsoft is going in a direction that's incompatible with the one I want my operating system to go in.

No appreciation of power users

Microsoft's decision to end the TechNet program, a service which gave power users, enthusiasts, and those whose job it is to test and support Microsoft products cheap and easy access to those products, is a strong indicator that the company no longer values what people like this bring to the platform.

Windows is now the expensive option

Windows is now the only operating system I use where I have to pay to upgrade it.

While I don't begrudge paying a fair price for something I need, paying big upgrade bucks for something I can do without makes no sense. PCs easily outlast the lifespan of the Windows operating system, and the idea of paying almost a hundred bucks per system to keep it updated is hard to stomach when it doesn't bring me any tangible benefits.

Going the Mac route might seem like an even more expensive option, but having owned a number of systems, including the MacBook Pro that has become my go-to system, the additional cost of the hardware (plus the additional AppleCare warranty) is offset by the fact that these systems have given me months, and in some cases years, of hassle-free use. I've not had to mess around with drivers.

I've not had to go digging through the configuration settings. I've not had to surf the web looking for solutions to obscure error messages.

Shift to console gaming

I used to love PC gaming, but then I got my first console.

While the graphics don't match up, and the gamepad is no substitute for the keyboard and mouse, the years of hassle-free gaming that a console offers, free from driver and patch headaches, more than make up for the deficiencies. Not only that, but when I consider how long I've had my Xbox 360, it's outlasted several gaming PCs, which has saved me a ton of cash.

Pick the game I want, insert the disc, and BOOM! I'm playing the game in seconds. No patches to download and install, no graphics card drivers to mess with.

The bottom line

The bottom line is that outside of a few edge cases, Windows isn't for me. If it works for you, then that's great. Stick with what works for you. I for one certainly won't sneer or look down on you or go all fanboy.

After all, I remember – with fondness, and more than a hint of sadness – a time when it worked for me.

Personal preferences are, well, personal.

Can I see a time when I might go back to Windows? Maybe, I'm not ruling anything out, but for the time being, I see Windows playing a smaller and smaller part in my day-to-day computing."


Analog Rules!

Like most everyone else who came of age in the 70s and 80s and wrestled with record cleaning brushes, cartridge alignment tools, antistatic sprays, and all the other voodoo seemingly required to play vinyl records, I fell in love with compact discs: the convenience, the sound, the general coolness of the damned things…

And like everyone else, I bought into the marketing hype of the time: "Indestructible!" "Will Last a Lifetime!" and so forth and so on. Of course, reality has proven something quite different, as we have all come to learn over the years. Scratch the wrong side of the disc (i.e., the label side) and you might as well play frisbee with the thing.

Years ago, when the news of "bit rot" (the tarnishing of the aluminum layer in commercial CDs and the fading of the dye layer in recordable CD-Rs, rendering both unreadable) came out, I thought, "Oh jeez…my collection is disintegrating right before my eyes and I don't even know it."

But along about the same time, I rediscovered the joys of those big black analog vinyl platters. I don't know whether it was prompted by an attempt to recapture some of my youth, or that I missed the music (most of which has yet to be re-released), or that I could pick up a pristine copy of some recording for $1 on vinyl that would cost me $16 on CD, or simply that the act of playing a record was so damned satisfying, but I fell in love with music all over again, and realized that no matter what happened to my CD collection, my vinyl collection would survive the ravages of time.

And surprisingly, many of those old vinyl records actually sound better than their shiny CD counterparts.

As Ted Rall so succinctly pointed out in a column several years ago, because of our rush into the digital age, not only are we at risk of losing some of the musical treasures of our time, we're also at risk of losing most of the record of our culture in general. Even if bit rot weren't a concern, we're still facing the very real possibility that none of our digitized history will even be readable in the years to come because of the ever-changing march of technology and the obsolescence it leaves in its wake.

And now, as we start to abandon owning physical media altogether, trusting all our musical memories to bits and bytes in the cloud, I fear this problem has only grown exponentially. I'm not saying it's a bad thing per se, and there's certainly something magical about being able to carry—at least in the case of the classic iPod—60 days' worth of music around in your pocket, but when I really want to be in my music, I pull out one of those black vinyl platters, plug my headphones into my 30-lb. wood-trimmed metal mid-70s receiver, and get lost in the sound…if only for a few brief minutes.

Makes you wonder if the ancients knew something we don't. They carved into stone tablets not because they had to, but rather, because they wanted to ensure that their legacy lived on.

One thing I know for certain: as long as someone can affix some sort of pickup needle to a phono cartridge, then centuries after the aluminum reflective layer in the last of my commercial CDs has tarnished, the dye on my home-grown CD-Rs has faded to invisibility, and my iPod is at the bottom of a landfill somewhere, whoever comes after me will still be able to play my records.