Categories
tech

Holding onto that iPhone box

I’m not a pack rat by nature. I don’t often keep items “just in case”; I’m more likely to trash them, even if I may regret it later.

There’s a notable exception to this minimalist streak, though: technology packaging. In our shed, I have the original boxes for nearly every tech device we own: two laptops, two iPads, two Kindles, AirPods, two Magic Keyboards, an Apple Pencil, and more. My rule of thumb? “Keep the box if there’s any chance you’ll resell this someday.”

That strategy may not be logical. Sure, buyers sometimes pay more for a device in its original box. Oddly, however, many reseller sites don’t actually care whether you have the original packaging; they’ll pay the same amount, either way.

Still, I’ll keep squirreling away the boxes, regardless of the financial return. There’s an intangible benefit: the mild satisfaction of sealing an iPhone in its original cardboard coffin. I feel like I’ve fulfilled my duty, stewarding my device from its shrink-wrapped birth to its day of departure. ◾

Categories
tech Uncategorized

Will cutting the “Ribbon” finally fix Office for Mac?

Yesterday, at its annual conference for IT pros, Microsoft revealed a new version of Outlook for Mac. The Verge published a glimpse of the app’s revamped interface, and it looks promising—simultaneously cleaner and more useful.

We’ll see more of that refined UI tomorrow. Meanwhile, let’s examine why Microsoft might be eager to dump an interface element that has dominated its software design for over a decade.


Ribbons on the Mac

I’ve generally been a fan of Microsoft’s “Ribbon” UI, which premiered in Office 2007 for Windows. It exposed features previously buried in submenus, and it simplified the productivity suite’s legendarily complex toolbars:

The Ribbon UI was successful enough that it eventually migrated to Office for Mac. Unfortunately (and ironically), on OS X, the Ribbon created the same problem it was designed to solve: interface cruft.

More specifically, the Ribbon conflicts with a permanent fixture of macOS: the menu bar. Every Office for Mac app has two similar yet contradictory menus—the operating system’s persistent menu and Office’s Ribbon. Making matters worse, some of the heading titles in these two menus are identical, while the options within those sections are not. This unfortunate mess leaves the user with no idea where to find a given feature—the Ribbon? The menu bar? Both? Neither?

Let’s visualize this problem. In each screenshot below, I’ve highlighted the duplicative menu items. First, PowerPoint for Mac, which boasts two redundant headers:

‘View’ and ‘Slide Show’ appear in both the Ribbon and the macOS menu bar.

Next, Excel, which has three:

Word:

OneNote:

Finally, there’s good (?) old Outlook:

For each menu pair, the submenus are not one-for-one identical. They have different items, different orders, and vastly different interfaces. And there’s no obvious explanation for why only some menu titles pull double duty. Why are there two “Insert” menus but only one “Format” menu?

In addition to confusing the user, these duplicative menus cramp the interface, consuming an unnecessary amount of vertical space.


A new hope?

So… does the new Outlook, with its “Ribbonless” interface, fix these problems?

Image courtesy of The Verge.

Honestly, I’m not sure it’s the perfect solution. This interface still eats up a lot of vertical pixels.

On the other hand, because I’m so excited to see the redundant menus vanquished, I’m willing to give Microsoft the benefit of the doubt. I can’t wait to try the new app.

So… what about the other Office apps? Microsoft tells The Verge that “there are no plans to announce updates to the ribbon elsewhere on Office for Mac” (emphasis mine). That’s an interesting way to phrase it. Microsoft isn’t denying that the feature is in its pipeline; it just claims that it hasn’t planned the reveal. Tricksy.

My guess: if the new Outlook app is well-received, we’ll eventually see the Ribbon (and its redundant menus) removed from Office for Mac, for good. ■

Categories
Culture tech

Should we feel bad for loving Apple keynotes?

Today is the “high holiday” in Apple’s liturgical calendar: iPhone keynote day. In a few hours, Tim Cook and his cardinal executives will unveil the new devices designed to drive Apple’s business during the upcoming year. Apple devotees around the world will attend (virtually), eager to heap adoration on the innovations heralded from Cupertino.

That may sound a bit cynical, but the whole Apple scene is a little silly. We’ve spent the past year speculating about today’s event on podcasts, on Twitter, and in blogged think pieces. We’ve chased down a thousand supply-chain rabbit trails. Today, we’ll salivate over devices that are only incremental improvements over the ones already in our pockets and strapped to our wrists. And in the weeks to come, we’ll exhaust ourselves in post-event analysis—then prepare to hand over piles of cash to buy into the hype.

Honestly, we invest too much time and money in these keynotes, considering the serious news unfolding in the “real” world. While we focus on Apple, a hurricane is bearing down on the East Coast. Free speech is under threat throughout the country. Refugees struggle just to survive.

Should we geeks feel guilty about our self-absorption and shallowness? The answer is “Yes, probably.”

But technology enthusiasts aren’t unique in enjoying frivolous distraction from more important things. Others, for example, follow the celebrity fashion scene. They visit TMZ every hour, follow faux-celebrities on Instagram, and plan their TV-watching around which starlets guest-star on which talk shows. This world has its own “high holidays,” too—for example, the red carpet preshow at the Academy Awards. As at the Apple keynote, industry leaders parade for the cameras, sporting fashions that viewers will eagerly buy in the upcoming year.

Or consider the world’s preeminent distraction: sports, into which so many Americans enthusiastically invest free time. Every team, for example, is orbited by a cadre of sports radio hosts, newspaper writers, podcasters, Twitter personalities, team-focused TV shows, and (most of all) fan bases that consume all this media. Hardcore fans gladly plunk down thousands for game tickets, cable TV packages, team jerseys, and memorabilia. And the “high holidays” come fast and frequent: home games tailor-made for tailgating, draft days, playoff runs, bowl games. It’d be hard to argue that sports deserves this level of attention (and consumption) any more than technology does.


Of course, other people’s obsessions don’t justify our own. The existence of fashionistas and sports nuts doesn’t mean that it’s okay that geeks spend so much time and money on tech.

But it helps to know we’re not alone in our penchant for expensive hobbies.  ■

Categories
tech

Imagining the future of AI photography

Portrait mode and its related features (e.g., Portrait Lighting) are halting first steps toward a true AI-augmented camera. Here’s a fanciful look at our smartphone future:


It’s April 4, 2027, and Julie is making good progress. For the seventh time that day, she clambers up the squeaky attic ladder and crouch-steps her way to a tall pile of cardboard boxes. She squints at the next box in the stack, just making out her mother’s scrawl: “FAMILY PHOTOS.” It slides easily across the dusty floorboards, and Julie descends the ladder, flopping the box from step to step above her.

With a heavy sigh, she sets the box down on a dining room chair. Her kitchen scissors make quick work of the duct tape; Julie glances inside—and winces. No neat stacks, no carefully-curated photo albums. Instead, the box is full to the brim with loose snapshots, unlabeled and unsorted. Just a few years back, organizing these photos would have taken an entire weekend.

Fortunately for Julie, times have changed. She works quickly, plucking photos from the box and tossing them into a messy grid on the table. Within a few minutes, she has strewn hundreds of memories across the oak panels. They’re arranged in no particular order; Julie spots a baby photo of her grandmother from the 40s, adjacent to a faded Kodak print of Aunt Susie driving in the mid–70s. The very next snapshot in the row is a Polaroid from Christmas 1991; her little brother triumphantly lifts a brand-new video game console package above his head.

With a nostalgic smile, Julie whips out her smartphone and opens the photo enhancement app that makes this spring cleaning possible. The real work begins; she waves the device over the table, lazily panning its viewfinder across the rows and columns of snapshots.

As she does, the camera does its magic. Each individual photograph is extracted, cropped, and saved to its own file. The process is nearly instant; after just a minute or two of haphazard scanning, the app beeps and confirms that it’s captured all the data it needs. Julie sweeps the impromptu collage into a waiting trash can.

It’s almost hard to believe how much she trusts the phone to capture these photos. Once, she would have been horrified to throw away such precious memories. Now, in a single day, she has filled a half-dozen garbage bags with old snapshots.

As she breaks down the empty cardboard box, the phone app (and the cloud service that powers it) does its own tidying up. First, it leverages machine learning to automatically recognize every object in every photo: that’s a baby beneath a maple tree in late fall. That’s a 1976 AMC Hornet. That’s a Sega Genesis.

With that context in hand, the service can clean up the photos. First, the easy stuff: wiping away physical scratches. Removing decades’ worth of discoloration and fade. Filling in missing, damaged details using robust healing algorithms. The AMC logo on the Hornet’s hood, obliterated by a coffee stain on the photo, is now recreated from a library of vintage car logos. A gouge in the leaves gets repaired too; maple leaves have a distinctive shape, and the app generates a few more to fill the hole. The Sega Genesis, motion-blurred in the boy’s excitement, is sharpened using actual product photography.

The restoration isn’t limited to inanimate objects, though. The app knows that it’s Aunt Susie who’s sitting behind the wheel of the Hornet, even though she’s obscured by glare on the windshield and some lens flare. Using Susie’s existing visual profile, the tool sharpens her face and glasses and fills in the blown-out highlights with data from other images.

The service automatically assigns metadata to each image, too. Every calendar, clock, Christmas tree, or party hat in the background helps the service narrow down the image date. Less obvious visual clues can help, too; the app might recognize that ’76 Hornet from previous scans and assign the photo of Susie to the same era. Even the girl’s tight perm could help to date the photo; given enough data, the app might know exactly when she adopted—and abandoned—that distinctive look. In the same way, visual cues could help pin down each photo’s location, too.

As Julie sets the last of the trash bags by the curb, she feels a twinge of bittersweet guilt. The garbage truck is on its way; soon, the original photos will be gone for good.

But based on experience, she’s confident enough in the restoration to let them go. The digitized shots are far more portable, more flexible, and more interesting than the paper copies ever were. She can make edits that would otherwise have been impossible—like tweaking the exposure level or even the location and brightness of the original scene’s light sources. Or altering the camera’s focal length, decades after the shot was taken; the app’s AI has used the source image to extrapolate exactly where each object sits in three-dimensional space.

Finally, Julie can even perform that “Zoom… enhance” magic promised by science fiction movies for decades. As she steps back into the kitchen, she grabs her tablet and plops down at the counter. Time to take a closer look at Aunt Susie’s unbelievable 70s curls. ■


Categories
apple tech

Fixing Portrait mode’s grossness

A few weeks ago, I bought an iPhone X. I love its face-unlock authentication and its gorgeously tall screen, but its dual-lens camera is easily my favorite feature. The iPhone X is the best camera I’ve ever owned, and it’s not even close. I’ve never had an SLR, and my last point-and-shoot was from the early days of digital photography.

In fact, the iPhone X camera is so good (or, rather, “good enough”) that it’s hard to imagine I’ll ever consider buying a standalone camera again.

That’s not to say there isn’t plenty of room for improvement. In particular, I find “portrait mode” (the fake bokeh blur effect) alternately intoxicating and maddening. In many cases, it does a great job isolating my photo’s foreground subject. But when it fails, it fails hard. As many others have pointed out, hair poses a serious challenge to its algorithm, as do non-contiguous background areas (e.g. a piece of the background visible through the crook of your subject’s arm) and ambiguous edges.

Could Apple fix these sorts of hiccups in software? This is my first dance with Portrait mode, so I can’t say whether the feature has improved since its first release last year. But I have at least some hope that edge detection will improve and fill in some of the gaps (pun intended).

Even if the algorithms improve, I’d like to see some way to touch up these problematic Portrait mode depth maps. There are already several interesting apps that let me see the underlying depth channel. Slør paints it as a black-and-white alpha channel; Focos lets me spin the depth layers around like I’m in some sort of sci-fi flick (“Rotate. Zoom. Enhance.”).

But neither of those apps—nor any others that I’ve heard of—let you actually fix the depth sensing errors that creep into Portrait mode photos. Take those non-contiguous background areas I mentioned earlier. Given a decent channel-editing interface, it would be relatively simple to paint a foreground area back into the background, where it belongs.
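To make the idea concrete, here’s a minimal sketch of what such a depth-channel edit could look like, assuming the depth map were exposed as a plain 2-D array. The `paint_background` helper and the larger-means-farther convention are my own illustration, not any real app’s API; on iOS, the underlying data would actually come from something like `AVDepthData`.

```python
import numpy as np

def paint_background(depth_map, mask, background_depth):
    """Return a copy of the depth map with the masked pixels
    reassigned to the given background depth."""
    fixed = depth_map.copy()
    fixed[mask] = background_depth
    return fixed

# Toy example: a 4x4 depth map where larger values mean "farther away."
# The camera has wrongly tagged a 2x2 patch of visible background
# (say, seen through the crook of the subject's arm) as foreground.
depth = np.full((4, 4), 0.9)   # true background everywhere...
depth[1:3, 1:3] = 0.2          # ...except a misclassified patch

mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True          # the region the user "paints" over

corrected = paint_background(depth, mask, background_depth=0.9)
```

In a real app, the mask would come from a brush tool rather than array slicing, and the edited depth channel would then drive the blur re-render.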

It’s possible that Apple’s developer-facing APIs won’t allow depth changes to be written back into a fully editable Portrait mode photo. If not, that’s a shame and ought to be corrected. In the meantime, though, I’d love to see an app handle depth edits “in-house”, then export a flattened photo (with the proper blur baked in) back to the camera roll.

Hopefully that sort of functionality arrives soon. Portrait mode is a blast, but it’s hard to feel too enthusiastic when it produces such glaring flaws. ■

Categories
tech

“Smart” homes are dumb.

UPDATE (11/16/2017): Steven Aquino provides a helpful reminder: smart home devices provide accessibility benefits that easily outweigh the finicky quibbles I raise below.


Smart devices are all the rage. You can buy an internet-connected version of nearly every home appliance and gadget: light switches that respond to your digital assistant’s commands. Shades that automatically open or close, depending on the weather forecast. Power outlets that switch on your lamp as you pull into the driveway. Light bulbs that match their color to what’s on your TV screen. Thermostats that lower the temperature when you leave the house. Deadbolts that unlock when your phone draws near.

Gadgets like these are fun; I’d love to play with some smart bulbs or a robot vacuum at some point down the road.

But beyond that? I don’t really understand why anyone would install a semi-permanent smart device in their house.

On the one hand, there’s the “faux-convenience” factor. With many smart home gadgets, you’re trading a device that’s simplistic but predictable for one that’s “advanced” but finicky. Consider: if a dumb light switch stops working, there’s a very limited number of things that could have gone wrong—basically, either the wiring came loose or the circuit breaker blew.

But with a smart light switch, you have those potential problems, plus many others. Maybe the device’s firmware is buggy. Maybe the manufacturer hasn’t updated their app for your new phone hardware. Maybe the smart home platform itself is half-baked. Maybe the trigger service (e.g. IFTTT) is offline. Perhaps the automation you programmed failed to anticipate the fall time change. The list of potential troubleshooting steps goes on and on.

You may be just nerdy enough to enjoy debugging your house. More power to you. But do the other residents of your home feel the same way? Chances are, your roommates, significant other, guests, or kids would prefer that things just work. What happens when you’re away for the night and your spouse can’t turn on the lights? You’ve basically extended the problem of over-complicated home theater set-ups to your entire house.

And what happens when you try to sell your “smart home”? Most buyers won’t be interested in inheriting your complex network of domestic devices. They may not share your penchant for tinkering, and they may view your smart gadgets as a maintenance nightmare, rather than an automation dream.

Even if you plan to stay in your current house forever, you face the problem of longevity. It’s not unusual for “dumb” devices to last for decades—even generations. When’s the last time a manual light switch or doorknob in your house just stopped working?

A smart device is a ticking time bomb. It’s only a matter of time until a) heat, dust, and age render the unit inoperable or b) the manufacturer or the home-hub vendor deprecates the device. An innocuous-looking app update could brick your light switches overnight. By installing smart devices, you’ve condemned yourself to upgrading the unit every decade (if not more frequently).


Computer miniaturization has led to remarkable quality-of-life improvements. A smart phone is infinitely more capable than the spiral-corded kitchen handset I grew up with. But that doesn’t mean that every device in my house deserves its own CPU. For basic home operations, rock-solid reliability is the only feature that matters. [Edit: that may be true for me, but not for everyone. See preface above.] Give me basic functionality over whiz-bang capability—at least when it comes to flicking on the lights. ■

Categories
tech

Never not connected: tracking gadget usage all day (and all night)

Last week, I compared my own media habits to those of the “average American.” As I tallied the hours, I was struck by how much of my waking life is spent using gadgets. Here’s my average weekday, with the devices italicized:

  • 4 AM: alarm goes off. I immediately reach for my iPhone. I spend 30–45 minutes (sometimes as long as an hour) catching up on Twitter via Tweetbot.
  • 5 AM: morning restroom visit; I typically weigh myself and record the result using Vekt on my Apple Watch.
  • 5:05 AM: meditation practice; I time my mindfulness sessions using Headspace or Insight Timer on my iPhone.
  • 5:30 AM: writing. I typically draft my posts in Sublime Text 3 on my HP ZBook Studio laptop, then queue up each article in WordPress.
  • 6:30 AM: exercising (if writing doesn’t consume the extra time). I track my workouts on the Apple Watch, and I usually listen to podcasts using Overcast on my iPhone (since podcasts on the Apple Watch are a no-go).
  • 7:30 AM: shower and prep for work. This is one of the few gadgetless reprieves in my day, although I have taken to wearing my water-resistant Apple Watch in the shower lately. It’s helpful to know how long I have before I need to punch the clock.
  • 8:00 AM: workday. I work in communications for a commercial real estate firm. My typical day at the office involves writing, light graphics editing, and layout, all of which keep me tied to my HP laptop. I dock the unit and connect it to the three external Dell monitors that ring my makeshift treadmill desk.
  • 12:00 PM: lunch. I often listen to podcasts on my iPhone while I cook, then browse Tweetbot as I eat. When I can squeeze it in, I’ll record the daily Careful Tech podcast using my laptop during this lunch break, too.
  • 1:00 PM: work, round two. More laptop use.
  • 5:00 PM: dinner prep and family time. This is the only stretch of the day when I truly set aside my gadgets. We may play some music on the Amazon Echo in the kitchen (my daughter is currently enamored with the song “Monster Mash”) or snap a few photos. For the most part, though, we aim to be present to each other during these pre-bedtime hours.
  • 7:00 PM: with our toddler in bed, my wife and I collapse in front of our TCL Roku TV to enjoy an episode or two of our favorite shows. Programs we’ve recently binge-watched include Stranger Things, The Great British Bake-off, Silicon Valley, and Star Trek: Discovery. When one of us is away for the evening, we have our own personal favorites (I’ve recently gotten into Halt and Catch Fire). Regardless of what’s on the TV, our primary attention is directed to our phones; my wife gravitates to Instagram and Facebook; I prefer Twitter.
  • 8:30 PM: evening bathroom routine. Yes, I often brush my teeth while scrolling through Tweetbot on my phone. Honestly, the main reason I hate flossing is that I need both hands to do it—and that means I have to set the phone down.
  • 9:00 PM: bedtime. It only takes a few minutes of Twitter-browsing on the iPhone in bed before I start to nod off.
  • 9:15 PM: sleep. I’ve taken to wearing my Apple Watch at night for sleep-tracking purposes. Autosleep uses the wearable device’s accelerometer to record how much rest I get each night. I don’t use this data for much of anything, but it’s fun to track.

In summary, I spend my entire day (and night!) using one device or another in one way or another.

That realization is sobering. So much of my life is tied to gadgets! In particular, I’m troubled by the fact that Twitter has become my default way to kill time. It’s the first thing I do when I wake up. It’s the last thing I do before falling asleep. I turn to it at the slightest sign of boredom. At least some of that time could be better spent—even if it just meant I was more present with my own thoughts.

On the other hand, just because my entire day involves gadgets doesn’t mean it revolves around gadgets. We use these devices for everything now; they can empower intentional, productive activity just as much as they can enable pointless or self-destructive behavior. For example, my (iPhone-led) meditation sessions are certainly beneficial, as is the sleep- and exercise-tracking made possible by my Apple Watch. And I don’t feel guilty that my work life requires constant connectivity; that’s the norm for most knowledge workers these days. ■

Categories
Culture tech

I’m a bad Uber rider

I’m traveling for work this week, and yesterday, I needed a ride from the airport to a hotel near a client campus. Enter Uber, a service I hadn’t used since January.

I’d like to claim that I deleted the app after Uber’s misogynistic “bro” culture was exposed earlier this year. But my long hiatus was really just due to the fact that I can’t use it where I live (in rural West Virginia). And when I found myself needing a lift yesterday, Uber was the path of least resistance. I didn’t have Lyft installed, and I didn’t want to bother tracking down a reputable taxi service.

So, although I felt a twinge of guilt, I requested an Uber ride. Less than two minutes after tapping the button, a silver Prius rolled up. I heaved my suitcase into its trunk and slid into the backseat. Off we went.

After a few minutes of small talk, the driver asked a question that took me by surprise. “Man, why’s your Uber rating so low? You seem fine.”

I was confused. “‘So low’?” I repeated. “What do you mean?” This was only my tenth Uber trip—ever. How could my rating be low? The driver explained that my score was 4.33 (out of a possible 5). That didn’t seem too bad; wasn’t that a solid B? I said as much, and the driver shook his head ruefully. “Nah, man. That’s a really bad rating.”

Doubt and cynicism set in. Was this some head game that drivers play to pry tips out of their passengers? As we continued to chat, I whipped out my phone and checked Google: “What is a low Uber passenger rating?” It turns out my driver was spot on. Any passenger score below 4.5 is a red flag in the Uber world.

My skepticism was replaced with anxiety. Why did past Uber drivers rate me so low? I had thought I was a model Uber citizen; I had never forced a driver to wait. I don’t drink, so riding while hammered wasn’t a worry. I had never slammed a door or tracked in mud. Why didn’t they like me? Was I oblivious to my own obnoxiousness? Did I have a subpar personality?

Maybe. But, after more Googling and reflection, I think my low score reflects my fundamental misunderstanding of the service. I didn’t even realize I had a passenger rating. I thought that Uber was a one-way street, like any customer-to-merchant business. I had failed to grasp that Uber is a network, not a service. It’s not a company that I pay to deliver me across town. It’s a connection concierge; I pay Uber to connect me to drivers. That’s a subtle distinction, but failure to grasp it led to some embarrassing mistakes:

For example, before yesterday, I had never tipped an Uber driver. That may make me sound like a cheap jerk; forgive my n00b status. I didn’t even know that Uber tipping was a thing. When I last used the service, tips hadn’t been added to the app. I assumed that one major appeal of the ride-sharing revolution was that Uber discouraged gratuities, instead bundling my tip into its fare. This cashless economy is great, I thought. Who wants to carry a wad of small bills in his pocket, anyways?

I was wrong. Uber drivers expect tips. It seems likely that I earned a one-star review or two by walking away without slipping my driver a fiver.

A second mistake I made as an Uber rookie? I underrated my drivers. On several past trips, I rated the ride like a restaurant on Yelp—i.e., honestly. If the car smelled gross or the driver was too chatty, I withheld a star or two. In fact, most of the ratings I had given before yesterday were less than five stars. That seemed fair; I mean, 4/5 is still pretty good, right?

Wrong. Drivers prize their Uber rating, and scores under five are reserved for disastrous service. Like it or not, grade inflation has spilled over into the gig economy; everybody wants an A.

It’s at least possible that my past drivers noticed when I gave them less than five stars; they might have marked me down in revenge.


Whatever the reason for my subpar rating, I’m now uber-conscious of how drivers see me, and I’m determined to do everything I can to earn a perfect score on each drive. My new philosophy is simple: drivers get five stars if I don’t die en route. And I plan to tip everyone, regardless of the level of service I receive.

Yes, this effectively makes Uber’s review system meaningless. The Prius yesterday reeked of cheap cologne and boasted mysterious stains (what causes an inch-wide white blotch on a headrest? Do I want to know?). But I’m too terrified of getting down-rated to make my displeasure known.

On yesterday’s trip, at least, my newly-enlightened dishonesty worked. A few minutes after checking into my hotel, I checked Uber again; my score had ticked up to a more-respectable 4.4. I’m officially a little less of a jerk. ■

Categories
Culture tech

Inflight entertainment and technophobia

Inflight entertainment was once a lifeline on commercial jets. Drop-down TVs made all the difference between a bearable multihour flight and an absolute hellslog. Even Will & Grace reruns, played back on a tiny, faded CRT three rows away, were a welcome distraction from the cramped discomfort of the average domestic flight.

The rise of mobile personal devices has changed things. Once, a personal LCD viewscreen with live satellite TV would have seemed like an unimaginable luxury. These days, I switch off that headrest screen without a second thought; I’d rather watch content I like, downloaded onto my own devices.

The airlines have noticed this change—more and more passengers ignoring the cabin-wide entertainment—and they’re updating their planes in response. Why bother with expensive entertainment hardware if people won’t use it? Some jets have even ditched the screens altogether, moving to onboard wifi as a means of distributing movies to customers’ personal devices.

That’s all well and good for digital natives, who can jump through the requisite hoops. I love having a library of recent releases to stream. But the move away from shared entertainment on flights isn’t as welcome for those who struggle with tech—or who don’t have their own smartphone or tablet.

My mother is a good example. As the airlines have shifted away from built-in screens, she’s left without anything to watch. She’s not familiar enough with her cheap Android smartphone to connect to the inflight streaming library. Consider the dance required: download the airline’s app before boarding, enable airplane mode, re-enable wifi, open the settings app, connect to the network, etc., etc. That’s a familiar dance for the young and nerdy; for her, it’s an insurmountable wall. She resigns herself to boredom, sitting quietly through interminable transcontinental flights.

The airlines ought to accommodate edge cases like my mom’s. A little tech support could go a long way on planes without in-cabin screens. Why not invite tech-averse passengers to press the call button to receive help navigating their devices? The cabin crew would receive baseline training for Android and iOS—just enough to help get customers connected.

Maybe that’s an unreasonable added burden for an already-overworked inflight staff. And maybe there are too many technophobes onboard the average flight to offer that sort of concierge-level hand-holding.

If so, the airlines probably shouldn’t have removed the shared screens in the first place. ■

Categories
Culture tech

Tribal malfunction (rooting for tech companies is silly)

Humans are instinctively tribal. Our fierce, hard-wired clan loyalty has its advantages; in the prehistoric age of hunter-gatherers, tribal commitment could make the difference between surviving together or dying alone.

That same tribal instinct drives our social behavior today, too. We’re driven by irrational devotion to sports franchises, political parties, and, yes, multinational technology companies.

In the last case, we’re bound to our “team” not by geography, ideology, or genetics, but by past purchases. Once we decide to invest thousands of dollars in one platform over another, we feel tremendous pressure to see that decision justified, to see “our side” come out on top. Hence, we see Apple hordes descending upon tech sites that don’t give Cupertino the credit it deserves.

Such brand affinity is a malfunction of our tribal programming, and it works against our own best interests. Google and Amazon will never return my allegiance, and their success is largely irrelevant to my own happiness. So why should I bother defending them, or deriding their competitors?

If anything, we should root against any one company—even our “favorite”—dominating the market. Apple customers should celebrate the successes of Google, Samsung, Microsoft, and Amazon at least as enthusiastically as Apple’s own victories. We need viable ecosystems and trend-setting products outside of iOS; competition is good for the industry, good for consumers, and good for Apple.

So, when Google debuts a phone like the Pixel 2, the logical response from Apple fans should be “That camera is incredible!”, not “Neener, neener! Apple was right about the headphone jack!” When Apple announces another record-shattering quarter of profits, Android aficionados should cheer, instead of prattling on about “sheeple” buying whatever Jony says is good.

Let’s leave the tech cheerleading to those on these companies’ payrolls. Let’s step back from the arena and let the tech giants duke it out themselves. And let’s look forward to the innovation ahead, no matter whether it comes from Mountain View, Cupertino, Seattle, or Redmond. ■


  1. Foam finger artwork courtesy of Vecteezy.