Thoughts on ‘Hamilton’ and theatergoing in the age of the smartphone

Last week, my wife and I had the opportunity to see *Hamilton* live in Pittsburgh.

We had decent enough seats—ten or so rows back in the left-hand orchestra section. While the view was partially obscured (we couldn’t see the elevated balcony at stage right), we were close enough to see the actors’ facial expressions clearly.

We had an amazing time. There’s a reason that Hamilton is a worldwide phenomenon; it’s a remarkable work of art. The show is cleverly self-referential—reprising leitmotifs and coyly paying off its dramatic promises. It’s playfully historical—grounding itself in real events but freely reinterpreting them, too. And it’s strikingly original—showcasing a genre foreign to Broadway, while also paying homage to musical theater’s long history.

However, my biggest takeaway from the show had nothing to do with the onstage performance. I was more fascinated by what happened in the theater during intermission. While Emily ran to the restroom, I sat and watched the crowd.

Here’s the thing: everyone was on their phone. I mean, easily 90% of the audience spent the intermission either staring at their smartphone or cradling it in hand. There were very few exceptions: the very old (some of whom may prefer not to own a phone) and the very young (i.e., kids who probably can’t wait for their first hand-me-down device).

We’ve gone through an incredible societal transformation in just over a decade. Twelve years ago, a Broadway intermission would have felt very different. Sure, a few people might have made a phone call on their flip phone, but nobody could’ve pulled an addictive “everything” device out of their pocket.

What did the 2006 audience do during those twenty-minute breaks? Doubtless, many would’ve buried their noses in the program—perusing the cast bios or the second act’s song list. But many attendees would’ve chatted up a neighbor and reflected together on the show. The hall’s decibel level might’ve been significantly higher—many more voices adding to the cacophony (rather than silenced in rapt attention to their phones). ■

Putting screen time to better use

More than three-quarters of all Americans own a smartphone. In 2018, those 253 million Americans spent $1,380 and 1,460 hours apiece on their smartphones and other mobile devices. That’s 91 waking days per person (assuming sixteen waking hours a day); cumulatively, that adds up to 370 billion waking American hours and $349 billion.

In 2019, here’s what we could do instead.

Paul Greenberg conducts a thought experiment: what would happen if Americans reallocated the time and money spent on smartphones into more productive activities?
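Before poking holes in the framing, it’s worth noting that the headline figures are at least internally consistent. A quick back-of-the-envelope check, using only the numbers quoted above (the one hidden assumption is a sixteen-hour waking day, which the “91 waking days” figure implies):

```python
# Back-of-the-envelope check of the figures quoted above.
owners = 253_000_000    # Americans with smartphones (2018)
hours_each = 1_460      # hours per person per year on mobile devices
dollars_each = 1_380    # dollars per person per year

waking_days_each = hours_each / 16     # assumes a 16-hour waking day
total_hours = owners * hours_each      # cumulative hours
total_dollars = owners * dollars_each  # cumulative spend

print(f"{waking_days_each:.0f} waking days per person")  # 91
print(f"{total_hours / 1e9:.0f} billion hours")          # 369
print(f"{total_dollars / 1e9:.0f} billion dollars")      # 349
```

At these exact inputs the hours total comes out to 369 billion rather than the quoted 370 billion; the small gap presumably reflects rounding in the underlying population figure.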

This article is a little silly. It shocks us by quantifying our excessive device time, but it ignores inconvenient questions. For example…

  1. What’s included in “other mobile devices”? Tablet computers fall into that category, presumably. What about e-readers? Laptops? My point: some “mobile devices” can be used for more productive and beneficial activities than others (e.g., reading or work).
  2. Why assume that all smartphone use is bad? Even if we were talking about smartphones on their own, it’s unfair to pretend that phones can’t also be used to read or pursue social justice.
  3. Why single out the smartphone? It’s not as if laziness or self-absorption didn’t exist before the iPhone’s release in 2007. Time wasted on our phones in 2018 would’ve been wasted on TV in 1988 or on radio in 1948.

Still, Greenberg has a point. We claim that we’re “too busy” to launch a new project, read more, or exercise. But what about the time spent thumbing through Facebook, playing Clash of Clans, or bingeing on Netflix? If we could reclaim just one hour each day from mindless smartphone use—then apply it towards nobler ends—where might we be a year from now? ■

An antidote for smartphone “zombie syndrome”?

The mind is no computer, but our consciousness still merges with our phones and tablets as seamlessly as a painter’s hand fuses with her brush or musicians vocalize through their instruments. This fusion can happen, Buddhist teaching holds, because consciousness is formless and adopts the qualities of everything it “touches.” Once we’ve immersed ourselves in our screens, they become our whole reality—and that’s why texting drivers look up with surprise when they rear-end the car in front of them.

Zen priest Kurt Spellmeyer, explaining why he never replaced his lost phone

For Spellmeyer, smartphones extend our minds—and this poses both an opportunity and a threat. Yes, our devices augment our mental capabilities, enhancing memory and accelerating calculations. But our phones also super-charge our penchant for self-distraction. As he explains,

The nonstop novelty prevents us from uncovering the sources of our suffering. We shuttle from one screen to the next, trying to allay our nagging sense that something’s missing or not right.

If you’re anything like me, you’ve frittered away entire afternoons, mindlessly refreshing Twitter or dipping glumly into app after app. Even though you never quite feel satisfied, you keep thumbing around, semi-consciously. Spellmeyer claims that meditation can quell our appetite for distraction and prevent “screen zombie” syndrome. 

For me, meditation hasn’t totally sapped screens of their allure. I still frequently drift between apps on autopilot. But I have noticed one difference: I’m more aware, in the moment, of losing myself. Questions arise, like “Is this making me happier?” and “Will I regret this later today?”

Sometimes, that’s just enough to interrupt the cycle, and I manage to set the phone down. ■

AI: from chess computer to… god?

Suppose that deeper patterns exist to be discovered — in the ways genes are regulated or cancer progresses; in the orchestration of the immune system; in the dance of subatomic particles. And suppose that these patterns can be predicted, but only by an intelligence far superior to ours. If AlphaInfinity could identify and understand them, it would seem to us like an oracle.

We would sit at its feet and listen intently. We would not understand why the oracle was always right, but we could check its calculations and predictions against experiments and observations, and confirm its revelations. Science, that signal human endeavor, would reduce our role to that of spectators, gaping in wonder and confusion.

Maybe eventually our lack of insight would no longer bother us. After all, AlphaInfinity could cure all our diseases, solve all our scientific problems and make all our other intellectual trains run on time. We did pretty well without much insight for the first 300,000 years or so of our existence as Homo sapiens. And we’ll have no shortage of memory: we will recall with pride the golden era of human insight, this glorious interlude, a few thousand years long, between our uncomprehending past and our incomprehensible future.

Steven Strogatz, writing in the New York Times

At first, AI as “oracle” seems silly. That term has religious overtones, and we typically apply it to mystics and gurus—not to computers.

But if humans really are hard-wired to worship, wouldn’t we instinctively revere an unexplainably accurate future-prediction machine? It’s not a huge leap from “wonder and confusion” to awe and devotion. ■