Calendrical Musings

The New Year has me reflecting upon the calendar that most of the world uses, the Gregorian calendar. I thought I would re-post this essay from StratFor, which does a very good job of examining its history and offers an idea or two for something different.

When England adopted the Gregorian calendar in 1752, some 170 years after it was introduced by Pope Gregory XIII, Benjamin Franklin wrote, “It is pleasant for an old man to be able to go to bed on Sept. 2, and not have to get up until Sept. 14.” Indeed, nearly two weeks evaporated into thin air in England when it transitioned from the Julian calendar, which had left the country 11 days behind much of Europe. Such calendrical acrobatics are not unusual. The year 46 B.C., a year before Julius Caesar implemented his namesake system, lasted 445 days and later became known as the “final year of confusion.”

In other words, the systems used by mankind to track, organize and manipulate time have often been arbitrary, uneven and disruptive, especially when designed poorly or foisted upon an unwilling society. The history of calendrical reform has been shaped by the egos of emperors, disputes among churches, the insights of astronomers and mathematicians, and immutable geopolitical realities. Attempts at improvements have sparked political turmoil and commercial chaos, and seemingly rational changes have consistently failed to take root.

Today, as we enter the 432nd year guided by the Gregorian calendar, reform advocates argue that the calendar’s peculiarities and inaccuracies continue to do widespread damage each year. They say the current system unnecessarily subjects businesses to numerous calendar-generated financial complications, confusion and reporting inconsistencies.

In years where Christmas and New Year’s Day each fall on a weekday, for example, economic productivity is essentially paralyzed for the better part of two weeks, and one British study found that moving a handful of national holidays to the weekend would boost the United Kingdom’s gross domestic product by around 1 percent.

The Gregorian calendar’s shortcomings are magnified by the fact that multiple improvements have been formulated, proposed to the public and then largely ignored over the years — most recently in 2012, with the unveiling of a highly rational streamlined calendar that addresses many of the Gregorian calendar’s problems. According to the calendar’s creators, it would generate more than $100 billion each year worldwide and “break the grip of the world-wide consensus that embraces a second-rate calendar imposed by a Pope over 400 years ago.” This attempt, like many of the others, has received some media attention but has thus far failed to gain any meaningful traction with policymakers or the wider public.

Myriad geopolitical elements and obstacles are embedded in the issue of calendar reform, from the powerful historical role of empires and ecclesiastical authorities to the unifying forces of commerce and the divisive nature of sovereignty and state interests. Indeed, geopolitical themes are present in both the creation of the Gregorian calendar and its permanence, and its ascendance and enduring primacy tell us much about the nature of the international system.

How We Got Here

At its core, the modern calendar is an attempt to track and predict the relationship between the sun and various regions of the earth. Historically, agricultural cycles, local climates, latitudes, tidal ebbs and flows, and imperatives such as the need to anticipate seasonal change have shaped calendars. The Egyptian calendar, for example, was established in part to predict the annual rising of the Nile River, which was critical to Egyptian agriculture. This motivation is also why lunar calendars similar to the ones still used by Muslims fell out of favor somewhat — with 12 lunar cycles adding up to roughly 354 days, such systems quickly drift out of alignment with the seasons.

The Gregorian calendar, introduced by Pope Gregory XIII in 1582, was itself an attempt to address the problems of its predecessor, the Julian calendar, which Julius Caesar had introduced to abolish the use of the lunar year and eliminate a three-month gap that had opened between the civil and astronomical equinoxes. The Julian calendar subsequently spread throughout the Roman Empire (and beyond, as Christianity spread) and influenced the design of calendars elsewhere. Though it deviates from the time it takes the earth to revolve around the sun by just 11 minutes (a remarkable astronomical feat for the time), the Julian system over-adjusted for the fractional difference in year length, slowly leading to a misalignment between the astronomical and calendar years. For the Catholic Church, this meant that Easter — traditionally tied to the spring equinox — would eventually drift into another season altogether.

By dropping 10 days to get seasons back on track and by eliminating the Julian calendar’s excess leap years, the Gregorian calendar came closer to reflecting the exact length of an astronomical year (roughly 365.24 days) — it is only off by 26 seconds annually, culminating in a full day’s difference every 3,323 years.
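The drift figures above are simple arithmetic, and can be checked in a few lines. This is a quick sketch, not part of the essay; it assumes a mean tropical year of 365.24219 days, which is why the result lands near, rather than exactly on, the essay's 26-second and 3,323-year figures:

```python
# Average year length implied by each calendar's leap-year rules.
julian_year = 365 + 1/4                      # leap year every 4 years
gregorian_year = 365 + 1/4 - 1/100 + 1/400   # skip century years not divisible by 400

tropical_year = 365.24219  # mean tropical year, in days (assumed value)

# Julian error: roughly 11 minutes per year.
julian_error_min = (julian_year - tropical_year) * 24 * 60

# Gregorian error: roughly 26-27 seconds per year.
gregorian_error_sec = (gregorian_year - tropical_year) * 24 * 60 * 60

# Years for the Gregorian error to accumulate to one full day.
years_per_day = 24 * 60 * 60 / gregorian_error_sec

print(round(julian_error_min, 1))   # ~11 minutes
print(round(gregorian_error_sec))   # ~27 seconds
print(round(years_per_day))         # on the order of 3,200-3,300 years
```

Small changes in the assumed tropical-year length move the final figure by a century or so, which accounts for the range of numbers quoted in different sources.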

But what was perhaps most significant about Pope Gregory’s system was not its changes, but rather its role in the onset of the globalized era. In centuries prior, countries around the world had used a disjointed array of uncoordinated calendars, each adopted for local purposes and based primarily on local geographical factors. The Mayan calendar would not be easily aligned with the Egyptian, Greek, Chinese or Julian calendars, and so forth. In addition to the pope’s far-reaching influence, the adoption of the Gregorian system was facilitated by the emergence of a globalized system marked by exploration and the development of long-distance trade networks and interconnections between regions beginning in the late 1400s. The pope’s calendar was essentially the imposition of a true global interactive system and the acknowledgment of a new global reality.

Despite its improvements, the Gregorian calendar preserved several of the Julian calendar’s quirks. Months still varied in length, and holidays still fell on different days of the week from year to year. In fact, its benefits over the Julian calendar are disputed among astronomers.

Nonetheless, its widespread adoption and use in trade and communication played a fundamental role in the development and growth of the modern international system.

Implementation Problems

From the start, however, the Gregorian calendar faced resistance from several corners, and implementation was slow and uneven. The edict issued by Pope Gregory XIII carried no legal weight beyond the Papal States, so the adoption of his calendar for civil purposes necessitated implementation by individual governments.

Though Catholic countries like Spain and Portugal adopted the new system quickly, many Protestant and Eastern Orthodox countries saw the Gregorian calendar as an attempt to bring them under the Catholic sphere of influence. These states, including Germany and England, refused to adopt the new calendar for a number of years, though most eventually warmed to it for purposes of convenience in international trade. Russia only adopted it in 1918 after the Russian Revolution in 1917 (the Russian Orthodox Church still uses the Julian calendar), and Greece, the last European nation to adopt the Gregorian calendar for civil purposes, did not do so until 1923.

In 1793, following the French Revolution, the new republic replaced the Gregorian calendar with the French Republican calendar, commonly called the French Revolutionary calendar, as part of an attempt to purge the country of any remnants of regime (and by association, Catholic) influence. Due to a number of issues, including the calendar’s inconsistent starting date each year, 10-day workweeks and incompatibility with secularly based trade events, the new calendar lasted only around 12 years before France reverted to the Gregorian version.

Some 170 years later, the Shah of Iran attempted a similar experiment amid a competition with the country’s religious leaders for political influence. As part of a larger bid to shift power away from the clergy, the shah in 1976 replaced the country’s Islamic calendar with the secular Imperial calendar — a move viewed by many as anti-Islamic — spurring opposition to the shah and his policies. After the shah was overthrown in 1979, his successor restored the Islamic calendar to placate protesters and to reach a compromise with Iran’s religious leadership. 

Several countries — Afghanistan, Saudi Arabia and Iran among them — still have not officially adopted the Gregorian calendar. India, Bangladesh, Israel, Myanmar, and a few other countries use various calendars alongside the Gregorian system, and still others use a modified version of the Gregorian calendar, including Sri Lanka, Cambodia, Thailand, Japan, North Korea and China. For agricultural reasons, it is still practical in many places to maintain a parallel local calendar based on growing seasons rather than relying solely on a universal system based on arbitrary demarcations or on seasons and features elsewhere on the planet (several members of the British Commonwealth, for example, still mark the turn of the seasons on the first of December, March, June and September, rather than the equinox and solstice points observed in the United States). In most such countries, however, use of the Gregorian calendar among businesses and others engaged in the international system is widespread.

Better Systems?

Today, the Gregorian calendar’s shortcomings have translated into substantial losses in productivity for businesses in the form of extra federal vacation days for employees, business quarters of different sizes and imperfect year-on-year fiscal comparisons. The lack of consistency across each calendar year has also created difficulties in financial forecasting for many companies.

Dozens of attempts have been made over the years to improve the remaining inefficiencies in Pope Gregory’s calendar, all boasting different benefits. The Raventos Symmetrical Perpetual and Colligan’s Pax calendars feature 13 months of 28 days, while the Symmetry 454 Calendar eliminates the possibility of having the 13th day of any month fall on a Friday. In 1928, Eastman Kodak founder George Eastman introduced a more business-friendly calendar (the International Fixed calendar) within his company that was the same from year to year and allowed numerical days of each month to fall on the same weekday — for example, the 15th of each month was always a Sunday. This setup had the advantage of facilitating business activities such as scheduling regular meetings and more accurately comparing monthly statistics.

Reform attempts have not been confined to hobbyists, advocates and academics. In 1954, the U.N. took up the question of calendar reform at the request of India, which argued that the Gregorian calendar creates an inadequate system for economic and business-related activities. Among the listed grievances were quarters and half years of unequal size, which make business calculations and forecasts difficult; inconsistency in the occurrence of specific days, which has the potential of interfering with recurring business and governmental meetings; and the variance in weekday composition across any given month or year, which significantly impairs comparisons of trade volume since transactions typically fluctuate throughout the week.

In 2012, Richard Conn Henry, a former NASA astrophysicist, teamed up with his colleague, an applied economist named Steve H. Hanke, to introduce perhaps the most workable attempt at calendrical reform to date. The Hanke-Henry Permanent Calendar (itself an adaptation of a calendar introduced in 1996 by Bob McClenon) is, as the pair wrote for the Cato Institute in 2012, “religiously unobjectionable, business-friendly and identical year-to-year.”

The Hanke-Henry calendar would provide a fixed 364-day year with business quarters of equal length, eliminating many of the financial problems posed by its Gregorian counterpart. Calculations of interest, for example, often rely on estimates that use a 30-day month (or a 360-day year) for the sake of convenience, rather than the actual number of days, resulting in inaccuracies that — if fixed by the Hanke-Henry calendar, its creators say — would save up to an estimated $130 billion per year worldwide. (Similar problems would still arise for the years given an extra week in the Hanke-Henry system.)
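The day-count problem described above can be made concrete with a toy interest calculation. The figures and dates below are hypothetical illustrations, not from the essay; the sketch uses a simplified version of the common US 30/360 convention, in which every month is treated as 30 days, and compares it against counting actual calendar days:

```python
from datetime import date

def days_30_360(start: date, end: date) -> int:
    """Simplified US 30/360 day count: every month is treated as 30 days."""
    d1 = min(start.day, 30)
    d2 = min(end.day, 30) if d1 == 30 else end.day
    return (end.year - start.year) * 360 + (end.month - start.month) * 30 + (d2 - d1)

# Hypothetical loan: $1,000,000 at 5% annual simple interest.
principal, annual_rate = 1_000_000, 0.05
start, end = date(2024, 1, 31), date(2024, 3, 1)

actual_days = (end - start).days       # 30 real calendar days (2024 is a leap year)
conv_days = days_30_360(start, end)    # 31 days under the 30/360 convention

interest_actual = principal * annual_rate * actual_days / 365
interest_conv = principal * annual_rate * conv_days / 360

print(actual_days, conv_days)                            # 30 vs 31
print(round(interest_actual, 2), round(interest_conv, 2))
```

Over this one short period the two conventions already disagree by a couple of hundred dollars on a million-dollar principal; multiplied across the volume of global lending, small per-contract discrepancies of this kind are the source of the large aggregate figures the calendar's creators cite.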

Meanwhile, it would preserve the seven-day week cycle and, in turn, the religious tradition of observing the Sabbath — the obstacle that has blocked many previous proposals’ path to success. As many as eight federal holidays would also consistently fall on weekends; while this probably would not be popular with employees, the calendar’s authors argue that it could save the United States as much as $150 billion per year (though it is difficult to anticipate how companies and workers would respond to the elimination of so many holidays, casting doubt upon such figures).

Obstacles to Reform and a Path Forward

Most reform proposals have failed to supplant the Gregorian system not because they fail to improve upon the status quo, but because they do not preserve the Sabbath, disrupt the seven-day week (only a five-day week would fit neatly into a 365-day calendar without necessitating leap weeks or years) or stray from the seasonal cycle. And the possibilities of calendrical reform highlight the difficulty of worldwide cooperation in the modern international system. Global collaboration would indeed be critical, since reform in certain places but not in others would cause more chaos and inefficiency than already exist in the current system. A tightly coordinated, carefully managed transition period would be critical to avoid many of the issues that occurred when the Gregorian calendar was adopted.
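The five-day-week observation is plain divisibility: 365 has no divisors other than 1, 5, 73 and itself, so five days is the only conventional week length that tiles a 365-day year with nothing left over. A quick check (not from the essay):

```python
# Which week lengths divide a 365-day year evenly?
fits = [week for week in range(2, 15) if 365 % week == 0]
print(fits)  # [5]

# A 7-day week leaves one day over (two in leap years),
# which is why dates shift to a different weekday each year.
print(365 % 7, 366 % 7)  # 1 2
```

That one-day remainder is exactly why leap weeks (as in Hanke-Henry) or out-of-week "blank" days (as in many earlier proposals) are unavoidable if a reform wants both a fixed year and an unbroken seven-day cycle.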

Today, in a more deeply interconnected, state-dominated system that lacks the singularly powerful voices of emperors or ecclesiastical authorities, who or what could compel such cooperation? Financial statistics and abstract notions of global efficiency are not nearly as unifying or animating as religious edicts, moral outrage or perceived threats. Theoretically, the benefits of a more rational calendar could lead to the emergence of a robust coalition of multinational interests advocating for a more efficient alternative, and successes such as the steady and continuous adoption of the metric system across the world highlight how efficiency-improving ideas can gain widespread adoption.

But international cooperation and coordination have remained elusive on far more pressing and less potentially disruptive issues. Absent more urgent and mutually beneficial incentives to change the system and a solution that appeals to a vast majority of people, global leaders will likely not be compelled to undertake the challenge of navigating what would inevitably be a disruptive and risky transition to an ostensibly more efficient alternative.

Any number of factors could generate resistance to change. If the benefits of a new calendar were unevenly distributed across countries — or if key powers would in any way be harmed by the change — any hope for a comprehensive global agreement would quickly collapse. Societies have long adjusted to the inefficiencies of the Gregorian system, and it would be reasonable to expect some level of resistance to attempts to disrupt a convention woven so deeply into the fabric of everyday life — especially if, say, the change disrupted cherished traditions or eliminated certain birthdays or holidays. Particularly in societies already suspicious of Western influence and power, attempts to implement something like the Hanke-Henry Permanent Calendar may once again spark considerable political opposition.

Even if a consensus among world leaders emerged in favor of reform, the details of the new system likely would still be vulnerable to the various interests, constraints and political whims of individual states. In the United States, for example, candy makers hoping to extend daylight trick-or-treating hours on Halloween lobbied extensively to push the end of daylight saving time into November. According to legend, in the Julian calendar, February was given just 28 days in order to lengthen August and satisfy Augustus Caesar’s vanity by making his namesake month as long as Julius Caesar’s July. The real story likely has more to do with issues related to numerology, ancient traditions or the haphazard evolution of an earlier Roman lunar calendar that only covered from around March to December. Regardless of what exactly led to February’s curious composition, its diminutive design reinforces the complicated nature of calendar adoption.

Such interference would not necessarily happen today, but it matters that it could. Policy is not made in a vacuum, and even the carefully calibrated Hanke-Henry calendar would not be immune to politics, narrow interests or caprice. Given the opportunity to bend such a reform to a state’s or leader’s needs — even if only to prolong a term in office, manipulate a statistic or prevent one’s birthday from always falling on a Tuesday — certain leaders could very well take it.

Nonetheless, a fundamental, worldwide change to something as long established as the calendar is not unthinkable, primarily because it has happened several times before. In other words, calendrical change is possible — it just tends to happen in fits and starts, lurching unevenly through history as each era refines, tinkers and adds its own contributions to make a better system. And if a global heavyweight with worldwide influence and leadership capabilities adopts the change, others may follow, even if not immediately.

 

Posted in General Musings, State of the Nation | Comments Off on Calendrical Musings

Snowflakes and How our Founding Fathers Got it Right

Members of the Snowflake Generation Reacting to Clinton’s Loss

As I read news accounts of police having to protect Electoral College members as they vote today to finalize Trump’s victory, I am shocked at the attempts by members of the snowflake generation to bully and intimidate electors in ways that Fidel would have loved. Or Stalin, or Lenin. How wonderful. Our country is acting like a banana republic.

Hillary won the popular vote; however, she did so only by carrying California. The voters in California who voted for Trump had to accept the fact that winner takes all in the Golden State. This isn’t true in Maine, by the way, where Trump lost the state but still won one electoral vote. We might say that Trump voters in California were disenfranchised while those in Maine had their voices heard. We might even be outraged at this failure of notions of plurality.

But I digress.

The piece below appeared today at www.marketwatch.com and is perhaps the best analysis I have read thus far.

By James E. Campbell (his work, not mine)

Shocked and appalled by the prospect of a Donald Trump presidency, some supporters of Hillary Clinton have turned to minimizing and even delegitimizing Trump’s election. In an era of severe political polarization, in an election with two candidates seen from the outset in highly unfavorable terms, after the most brutal campaign in modern history, and with an outcome that astonished just about everyone, these reactions are understandable, but wrong.

Many die-hard Clinton supporters cannot bring themselves to believe their candidate could lose to Donald Trump. They think: How could such a crude and inept con man be elected president? Even after it has happened, it is unthinkable, a nightmare. So, the election must not have been fair.

Those on the fringe raise the specter of diabolical Russians hacking away at our democracy. More grounded Clintonians have less malevolent bogeymen — our Founding Fathers. As they see it, the election’s outcome should be blamed on a dysfunctional and archaic electoral-vote system. Hillary won the national popular vote. She should be president. It is as simple as that. The Electoral College should go the way of Trump University.

They are right about one thing: Clinton did win the popular vote, by some 2.8 million votes, as the most recent data show.

Yet Clinton has only 232 electoral votes (in 20 states plus Washington, D.C.) to Trump’s 306 (in 30 states plus one from Maine), making him the president-elect. So, Trump’s election without a popular-vote plurality is regarded as an injustice. Some Democrats claim a moral victory as victims of an electoral-vote system that once again horribly “misfired.” Their claim, however, neglects two facts.

First, had the election been conducted with rules awarding the presidency to the popular-vote winner, the candidates and many voters quite probably would have acted very differently, and the popular vote might not have been the same. Trump and Clinton would have campaigned in the “safe” states.

Potential voters in those states would have felt more pressure to turn out and to vote for “the lesser of two evils” and not to waste their votes on third-party candidates. Some additional Clinton voters would probably have shown up, but gains on the Trump side would probably have been larger as more reluctant Republicans would have been pushed to return to the fold, particularly in big blue states like California, New York and Illinois.

In short, a comparison of the national popular vote as cast and the electoral-vote division is no simple matter. This is particularly true in our age of pervasive polling in which people should have a good idea about whether they live in a state where their presidential vote might make a difference.

Second, Clinton’s popular-vote plurality over Trump depends on the votes in a single state: California, which single-handedly turns a Trump plurality into a Clinton plurality.

The electoral-vote system in 2016 (as in 2000, when George W. Bush became president despite losing the national popular vote) functioned as its defenders have long claimed. It prevented a single region (in this instance, a single state) from overruling the verdict of the more populous and diverse nation. Donald Trump’s election is difficult for many Americans (read: Snowflakes) to accept, but there is no good reason to question its democratic legitimacy.

For better or worse, Trump won the presidency by constitutional and sensible democratic rules that guided both campaigns and were known to any politically conscious citizen.

He also won the national popular vote cast outside of the single state of California. Moreover, Clinton won all of California’s 55 electoral votes despite the fact that 4.3 million of the state’s voters voted for Trump. That big winner-take-all advantage for California’s Democrats and Clinton was certainly felt, but it wasn’t enough to override her losses in many other states.

Under our electoral-vote system, American voters elected a national president, not California’s choice. It is in the nation’s interest for the Democratic Party’s leaders and for Clinton voters to fully recognize the legitimacy of the election, as they had urged Trump to do after the third presidential debate.

The Electoral College system worked as it should. It did not “misfire.” The election’s outcomes were ultimately about what Americans wanted and what they did not want — not about electoral mechanics.

James E. Campbell is a distinguished professor of political science at the University at Buffalo, and is the author of “Polarized: Making Sense of a Divided America”.

Posted in General Musings | Comments Off on Snowflakes and How our Founding Fathers Got it Right

Counterpoint to Carr

Nicholas Carr, in his seminal work, The Shallows: What the Internet Is Doing to Our Brains, makes the point that reliance on the Internet and on hypertext is effecting a change in the essential wiring of our brains. His argument, backed by several dozen studies and his own experience, is that by adopting the so-called “sound bite” approach to information so artfully employed by television newscasts, the Internet is reducing comprehension, retention, and ground-level understanding of things. We no longer think deeply. A quote from the jacket of the book sums it up nicely:

“The core of education is this: Developing the capacity to concentrate. The fruits of this capacity we call civilization. But all that is finished, perhaps. Welcome to the shallows, where the un-educating of homo sapiens begins. Carr does a wonderful job synthesizing the recent cognitive research. In doing so, he gently refutes the ideologists of progress and shows what is really at stake in the daily habits of our wired lives: The reconstitution of our minds. What emerges for the reader, inexorably, is the suspicion that we have well and truly screwed ourselves.”

Ultimately, the book is about the preservation of the human capacity for contemplation and wisdom.


Enter this piece, recently put up at www.techradar.com, wherein its author makes the point that perhaps Carr was only slightly right, and that the Internet may actually make us, the collective us, much smarter. I don’t agree, but it is worth a review:

I’m supposed to be making sure I can use the term ‘down with the fourth wall’ in the context I want to. But within a few seconds I’ve found an article titled ‘25 Classic Moments When Movies Broke the Fourth Wall’ and within a few more I’m learning that The Big Short enlisted Anthony Bourdain and a metaphor around a seafood stew to explain what a collateralized debt obligation is.

That’s great. Within a few more seconds I’ve saved all of those things, and I can continue back on my original path, finding out more about the fourth wall usage.

If I were reading a book, it would be the same as using a contents page to search for the relevant chapter, finding a more compelling chapter title, skim-reading that for a few pages, and then returning to the task at hand.

This way of learning hasn’t changed all that much; yet there is a growing concern in contemporary neurological studies around what is changing. With vast plains of external memory now available, can the human brain afford to be purposefully forgetful?

The Google effect

Gone are the days when we needed to remember phone numbers, house numbers, birthdays or appointments. The details we need access to require a lot less context, and virtually no mapping out of information. Your brother’s house number, the other names of your girlfriend’s family in case they pick up the home phone, even your Nan’s birthday – forgetting any of these will no longer leave you stranded in a social quagmire (although you should probably try and remember that last one).

What worries me, frankly, is the air of idiocy that this perpetuates. And the disregard. You SHOULD remember your mother’s birthday. You SHOULD remember the other names of your girlfriend’s family. My God, are you THAT shallow? Please.

The reliance that we place on our brains to accurately recall information is in decline, while the bond we form with technology that can outsource this information strengthens.

Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips, a landmark 2011 paper published in the journal Science by Betsy Sparrow, Jenny Liu and Daniel Wegner, found that “[The Web] is an external memory storage space, and we make it responsible for remembering things.”

In one of Sparrow’s studies, two groups of undergraduates were given trivia statements. One group were told they could retrieve the information later on their computer, while the other were told they wouldn’t be able to check back. The first group had worse recall than the second, suggesting that our brains are learning to disregard information found online.

This effect, as with the plasticity of brains, becomes stronger each time we experience it. The more we turn to Google for our answers, the less likely we are to retain what we find there. Instead of remembering a fact, Sparrow’s findings suggest that internet users have learned to remember how to find a fact.

Phone a friend

The good news is that our brains have never been that adept at remembering. Instead, we’ve historically used a technique called transactive memory, a term Sparrow’s peer Wegner proposed to signify the group mind (or hive mind, if you’re insufferable).

A transactive memory system is a mechanism through which groups collectively encode, store and retrieve knowledge. According to Wegner, a transactive memory system consists of the knowledge stored in each individual’s memory, but mixed in with metamemory containing information regarding a teammate’s domains of expertise. Effectively, you know what you know, and you know what kind of thing the others know.

There’s a reason you can’t stop thinking about consulting your phone during the pub quiz. A transactive memory system used to mean relying on your local community, whether that was your family or your pub quiz team. Now search engines, Evernote and smartphones are replacing personal contacts as our trusted external memory sources.

How many times have you been at a pub quiz aching to reach for your phone for a quick Google check? That’s your brain deciding that a search engine is a more reliable ‘phone a friend’ than your best mate.

It’s a subject that Nicholas Carr spends a lot of time on when deliberating over the future of the adult brain in his book The Shallows, published in 2010. He argues that the internet is exposing the human brain to mind-altering technology on a scale not experienced since the printing press.

Our brains, according to Carr, are being rewired so they can only accommodate superficial understanding. Carr saw the ease of online searching and the distractions of browsing through the web as possibly limiting our capacity to concentrate.

However, with a few years’ breathing space from The Shallows it’s actually beginning to look like our brains, plasticity and all, deserve a bit more credit.

After all, they were never specifically ‘programmed’ to read, to make it into town for bang-on 11am, to write, or to use a printing press – we’ve just been able to adapt. In spite of Carr’s claims that we’re drifting into the shallows of comprehension, it’s likely that, inside our heads, something much more complex is happening.

The intelligence race

Contrary to the assertions Carr made in his 2008 article Is Google Making Us Stupid?, futurist Jamais Cascio argued in a 2009 issue of Atlantic Monthly that technology is actually making us smarter. Cascio made the case that the array of problems facing humanity will turn the fight for survival into an intelligence race:

“Most people don’t realize that this process is already under way,” he wrote. “In fact, it’s happening all around us, across the full spectrum of how we understand intelligence. It’s visible in the hive mind of the internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity.”

A new generation of apps promises total recall for users.

It’s now been five years since The Google Effect first made headlines. The idea that the internet can change how we think is no longer new or revolutionary. But the next generation of apps that aid these inventions with claims of total recall are still evolving – and you can bet our brains are too.

Total recall

Apps like Evernote, Slack and Trello free up the ridiculous amount of time it can take to track down pesky snippets of information by making it intuitive to find them. Since its inception in 2008, note-taking service Evernote has been busy building a base of more than 100 million users who are probably best described as converts, such is their commitment to the Evernote way of never forgetting.

Slack, founded in 2013, and Trello, founded in 2011, are often mentioned in the same breath: Slack stands for Searchable Log of All Conversation and Knowledge, while Trello is a web-based project management system. These apps can be seen as the direct descendants of the Google search – of those early instincts to bookmark a page or even (lest we forget) physically write down a URL in a separate notebook to return to later. Although each app serves a different purpose, their end goal is the same: increase productivity and reduce time spent remembering.

The numbers are persuasive. Slack users claim that using the app results in 25 percent fewer meetings, a 48 percent reduction in internal emails and a 32 percent increase in productivity.

James Sherrett, Slack’s senior manager of accounts, said the changes Slack makes to its users’ memories are very specific to different individuals, but he added: “One of the things Slack users consistently tell us is that having Slack for all their team communication is like having a searchable, external super brain. They can trust that anything they need to know is in Slack – it’s all pulled together into one place and searchable.”

The days of learning by rote are over.

I asked Sherrett if he thinks there’s a risk of losing other skills when we don’t need to remember as often. He told me: “Rote memorization used to be a skill taught in schools and valued in adults. Being able to recite a speech from memory or cite a fact was considered a marker of intelligence.

“But that’s changed. We no longer teach rote memorization in schools. We teach ways to recall and search for information. This change has coincided with the increasing use of computers to store and manage information. And computers are very good at storing and managing information, and our brains are very flexible in how they adapt to get the information they need.”

There it is again – that emphasis on storing the memory away instead of having it just to hand. I turned to Cascio to find out what he thinks of the rise of tools like Evernote, Slack and Trello, and whether they are making subtle or seismic changes to our memory.

Clippy for a new millennium

“The impact of these apps is an interesting dilemma, and one without an obvious ‘right’ answer,” Cascio said.

“Precision of memory is probably the greatest advantage these apps have over human ‘meat’ memory – our brains are notoriously bad at remembering the fiddly little details of things, and it’s commonly understood that when we remember something, our brains are functionally remembering the last time we remembered it, allowing errors to creep in. With an app, the accurate details are preserved.

“There is one respect in which our evolved biological brains still excel over digital memories: inference-based recall. It’s the ‘that reminds me of something…’ moment – the new recollection may not be directly related to what you were thinking about, but contains some non-obvious trigger for a link.

“This is all moving towards a world where recollection from a memory locker app wouldn’t require an active search. Imagine an app that pays attention to what you’re writing or saying and offers (in a non-obtrusive way) relevant items from your stored memories. Something like this would really serve as a brain co-processor, and not just a data dumpster.

“If it works, it would be like an extension of your own mind; if it doesn’t, it would be like Clippy with a power rating of over 9,000. ‘You appear to be engaged in a romantic encounter. Here’s a list of all of the times you’ve said the wrong thing during a make-out session.’”
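Strip away the jokes and Cascio’s imagined “brain co-processor” boils down to a relevance search: watch what the user is writing, compare it against stored notes, and quietly surface only what clears some threshold of relevance. Here is a minimal sketch of that idea in Python – the function names and the simple word-overlap scoring are my own illustration, not the design of any real app:

```python
# Toy sketch of a "brain co-processor": given what you're currently
# typing, surface the most relevant stored note - or stay silent.
# All names here are hypothetical illustrations, not a real app's API.

def tokenize(text):
    """Break text into a set of lowercase words."""
    return set(text.lower().split())

def suggest(current_text, notes, threshold=0.2):
    """Return the stored note most similar to the current text,
    or None if nothing clears the relevance threshold."""
    query = tokenize(current_text)
    best_note, best_score = None, 0.0
    for note in notes:
        words = tokenize(note)
        overlap = len(query & words)
        union = len(query | words)
        score = overlap / union if union else 0.0  # Jaccard similarity
        if score > best_score:
            best_note, best_score = note, score
    return best_note if best_score >= threshold else None

notes = [
    "meeting with dana about the budget on friday",
    "recipe for grandma's lemon cake",
    "ideas for the q3 marketing plan",
]

print(suggest("what was the budget meeting with dana", notes))
# → meeting with dana about the budget on friday
```

A real assistant would use far richer similarity measures, but the threshold is the crucial design choice Cascio hints at: set it too low and the tool becomes Clippy, interrupting with everything; set it high enough and it stays out of the way until a genuinely relevant memory surfaces.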

More room for the good stuff

Following the release of her study in 2011, Sparrow emphasized in interviews that “memory is so much more than memorization.” She argued that this shift away from memorizing may ultimately help people improve their comprehension and become better learners. The Google effect may allow us to free up more space on our internal hard drives, and focus on processing as opposed to memorizing.

Does outsourcing the banal details leave more room for the vital memories?

When we’re no longer expected to remember birthdays, postcodes, kings and queens, or even the century a country was founded in, are we saying goodbye to depth and fluidity? Are the days of serendipitous thoughts, malapropisms and parallel thinking over? Or does it leave our fervently human brains with more room for these very freedoms?

Certainly, all these promises to Keep Track of Everything and Never Forget (claims made by Trello and Evernote respectively) can feel tempting. But for anyone who’s grown up straddling two centuries, there’s something inherently human in being forgetful. While memories of a time without smartphones or Wi-Fi grow fainter, there’s something wonderfully comforting in not remembering every detail.

Because, after all, we’re only supposed to recall what was truly memorable.

The thing is this: what is truly memorable is nearly limitless. We are told that we use perhaps 10 percent of our brains. Are we happy with that level of utilization? I think not.

 

Posted in Counseling Concepts, General Musings | Comments Off on Counterpoint to Carr

Do what works: Theorize about it later

With the death of Fidel Castro we are presented with an interesting opportunity to compare and contrast two rather authoritarian leaders and the results they engendered in their home countries. I won’t go into Castro’s successes – there were none – simply because the Cuba of 2016 is essentially the same as the Cuba of the day he took over. No, I’d rather take a quick look at the leader of Singapore, and what he did to take his country out of the shambles of post-colonialism and into the league of first-world nations.

Lee Kuan Yew was the father of modern Singapore. A politician of the highest order (read: a leader), Lee was responsible for the development of Singapore into what became known as an Asian Tiger economy, leveraging what he felt was the only natural resource the country had – its people.

Yes, he had a deep-water port, but the island nation had too little fresh water. Yes, he had industrious people, but he lacked most other natural resources. It was a conundrum of the highest order. To avoid being sucked into the larger neighboring country of Malaysia – which he at one time sought to join to ameliorate his country’s shortcomings – Lee essentially built a country from scratch.

He eschewed populist policies. In a word, he led.

He favored pragmatic long-term social and economic measures. He adopted meritocracy and multi-racialism as governing principles. He made English the common language, both to integrate Singapore’s immigrant society and to facilitate trade with the West. At the same time, he mandated bilingualism in the schools to preserve students’ mother tongues and ethnic identities. Yes, he curtailed civil liberties by banning public protests and asserting near-total media control, and he even went so far as to bring libel suits against his opponents. His argument was that such disciplinary measures were necessary for political stability, which, together with the rule of law, was essential for economic progress. One cannot argue with the result: Singapore is a modern powerhouse. Its people are healthy, wealthy, and, because of the education system that Lee presided over, quite wise.

He had an approach that centered on one question: Does it work? He didn’t throw away big ideas or theories, or even discount them per se. He simply asked that they – the idea, the solution, whatever – meet one simple, pragmatic standard: Does it work? It is the Git-R-Done style of leadership that I hope our new president will take to heart.

Try it out the next time you study a philosophy, a value, an approach, a theory, an ideology – it doesn’t matter whether the source was a great thinker of antiquity or your grandmother. Has it worked? Let us call this “Lee Kuan Yew’s Rule,” to make it easy to remember.

He wrote:

My life is not guided by philosophy or theories. I get things done and leave others to extract the principles from my successful solutions. I do not work on a theory. Instead, I ask: what will make this work? If, after a series of solutions, I find that a certain approach worked, then I try to find out what was the principle behind the solution. So, Socrates, Plato, or Aristotle, I am not guided by them…I am interested in what works…Presented with the difficulty or major problem or an assortment of conflicting facts, I review what alternatives I have if my proposed solution does not work. I choose a solution which offers a higher probability of success, but if it fails, I have some other way. Never a dead end.

We were not ideologues. We did not believe in theories as such. A theory is an attractive proposition intellectually. What we faced was a real problem of human beings looking for work, to be paid, to buy their food, their clothes, their homes, and to bring their children up…I had read the theories and maybe half believed in them.

But we were sufficiently practical and pragmatic enough not to be cluttered up and inhibited by theories. If a thing works, let us work it, and that eventually evolved into the kind of economy that we have today. Our test was: does it work? Does it bring benefits to the people?…The prevailing theory then was that multinationals were exploiters of cheap labor and cheap raw materials and would suck a country dry…Nobody else wanted to exploit the labor. So why not, if they want to exploit our labor? They are welcome to it…. We were learning how to do a job from them, which we would never have learnt… We were part of the process that disproved the theory of the development economics school, that this was exploitation. We were in no position to be fussy about high-minded principles.

In Cuba, Castro was preeminently fussy about high-minded principles, most of which now lie in the dustbin of history’s stupid ideas. His people, while literate, are abjectly poor and skilled in virtually nothing other than raising sugar cane.

Twelve thousand miles to the west, in Singapore, Lee Kuan Yew took care of his people and allowed them to flourish. What else is a leader paid to do?

Posted in General Musings, State of the Nation | Comments Off on Do what works: Theorize about it later

Quit Social Media – PLEASE!

This article appeared in the New York Times on the 19th of November, 2016. It captures my sentiments about things like Facebook and LinkedIn almost perfectly. Therefore, I am reproducing the article here while claiming no ownership of it whatsoever.

 


Quit Social Media! Your Career May Depend on It.

From Preoccupations in the New York Times, November 19, 2016

By Cal Newport

I’m a millennial computer scientist who also writes books and runs a blog. Demographically speaking I should be a heavy social media user, but that is not the case. I’ve never had a social media account.

At the moment, this makes me an outlier, but I think many more people should follow my lead and quit these services. There are many issues with social media, from its corrosion of civic life to its cultural shallowness, but the argument I want to make here is more pragmatic: You should quit social media because it can hurt your career.

This claim, of course, runs counter to our current understanding of social media’s role in the professional sphere. We’ve been told that it’s important to tend to your so-called social media brand, as this provides you access to opportunities you might otherwise miss and supports the diverse contact network you need to get ahead. Many people in my generation fear that without a social media presence, they would be invisible to the job market.

In a recent New York magazine essay, Andrew Sullivan recalled when he started to feel obligated to update his blog every half-hour or so. It seemed as if everyone with a Facebook account and a smartphone now felt pressured to run their own high-stress, one-person media operation, and “the once-unimaginable pace of the professional blogger was now the default for everyone,” he wrote.

I think this behavior is misguided. In a capitalist economy, the market rewards things that are rare and valuable. Social media use is decidedly not rare or valuable. Any 16-year-old with a smartphone can invent a hashtag or re-post a viral article. The idea that if you engage in enough of this low-value activity, it will somehow add up to something of high value in your career is the same dubious alchemy that forms the core of most snake oil and flimflam in business.

Professional success is hard, but it’s not complicated. The foundation to achievement and fulfillment, almost without exception, requires that you hone a useful craft and then apply it to things that people care about. This is a philosophy perhaps best summarized by the advice Steve Martin used to give aspiring entertainers: “Be so good they can’t ignore you.” If you do that, the rest will work itself out, regardless of the size of your Instagram following.

A common response to my social media skepticism is the idea that using these services “can’t hurt.” In addition to honing skills and producing things that are valuable, my critics note, why not also expose yourself to the opportunities and connections that social media can generate? I have two objections to this line of thinking.

First, interesting opportunities and useful connections are not as scarce as social media proponents claim. In my own professional life, for example, as I improved my standing as an academic and a writer, I began receiving more interesting opportunities than I could handle. I currently have filters on my website aimed at reducing, not increasing, the number of offers and introductions I receive.

My research on successful professionals underscores that this experience is common: As you become more valuable to the marketplace, good things will find you. To be clear, I’m not arguing that new opportunities and connections are unimportant. I’m instead arguing that you don’t need social media’s help to attract them.

My second objection concerns the idea that social media is harmless. Consider that the ability to concentrate without distraction on hard tasks is becoming increasingly valuable in an increasingly complicated economy. Social media weakens this skill because it’s engineered to be addictive. The more you use social media in the way it’s designed to be used — persistently throughout your waking hours — the more your brain learns to crave a quick hit of stimulus at the slightest hint of boredom.

Once this Pavlovian connection is solidified, it becomes hard to give difficult tasks the unbroken concentration they require, and your brain simply won’t tolerate such a long period without a fix. Indeed, part of my own rejection of social media comes from this fear that these services will diminish my ability to concentrate — the skill on which I make my living.

The idea of purposefully introducing into my life a service designed to fragment my attention is as scary to me as the idea of smoking would be to an endurance athlete, and it should be to you if you’re serious about creating things that matter.

Perhaps more important, however, than my specific objections to the idea that social media is a harmless lift to your career, is my general unease with the mind-set this belief fosters. A dedication to cultivating your social media brand is a fundamentally passive approach to professional advancement. It diverts your time and attention away from producing work that matters and toward convincing the world that you matter. The latter activity is seductive, especially for many members of my generation who were raised on this message, but it can be disastrously counterproductive.

Most social media is best described as a collection of somewhat trivial entertainment services that are currently having a good run. These networks are fun, but you’re deluding yourself if you think that Twitter messages, posts and likes are a productive use of your time.

If you’re serious about making an impact in the world, power down your smartphone, close your browser tabs, roll up your sleeves and get to work.


Cal Newport is an associate professor of computer science at Georgetown University and the author of “Deep Work: Rules for Focused Success in a Distracted World” (Grand Central). This article is Copyright, 2016, by the New York Times.

Posted in Blogging, Business, Counseling Concepts, Dissertation and Research, General Musings | Comments Off on Quit Social Media – PLEASE!

I was Wrong, But There Isn’t a Coattail in Sight

I was wrong. We awoke to the news that Trump had won. I am still digesting what occurred, but it looks like Donald was able to turn several important regions his way. The states of Wisconsin, Pennsylvania and Ohio, for example, tilted heavily in his direction, and I think it is because he spoke to the latent desire of people to have work to do. Nearly 100,000,000 Americans do not work – they are not in the workforce. They live on savings, on the earnings of spouses, and on the government dole. The inherent nature of man is to build, not to sit around. Trump tapped this quite effectively.

However, I still think it was Hillary’s election to lose, and she did so by demonizing millions of Americans who supported Trump when she called them “deplorable.” That resonated far and wide. She also did herself no favors by running on Obama’s record, which is seen as lackluster at best. And, finally, there was the lingering sense that she broke any number of laws, including the famous “email server in the basement” nonsense. Anyone else would have lost their job and possibly gone to jail. I also think it was a mistake to let her husband campaign (even though I thought Bill did an excellent job governing our country). He brought up old wounds, including the whole of the “blow job in the Oval Office” business and the idea that they, the Clintons, were somehow above the law.

Let us speak truth to power, however. There were no coattails. The Republican majority in the House, still intact, declined by six seats. In the Senate, it dropped by two. Trump has no mandate. None whatsoever.

I voted for Trump, which is to say that I ticked the ballot box for Trump. However, truth be told, I voted against Hillary. I simply could not get past the character issue, even though I don’t for a minute think that Donald is an angel. I suspect, but will never really know, that many people did precisely the same thing.

This then brings us to the divide in our country, which is now nearly 50-50. If it turns out that she won the popular vote, it will have been by the slimmest of margins. And that will only highlight the permanent division in our country. Look for increasing calls to split the country up. The east and west coasts went decidedly for Clinton, as they did for Obama and Gore and the first Clinton. They will never give up.

The election puts the lie to “big money politics,” Trump’s wealth notwithstanding. He spent an average of $5 per vote, while Clinton spent nearly four times that amount. He stayed on message and ran as much for himself as he did against Obama (which is to say, he made it clear that Clinton was in effect running for Obama’s third term). What her message was has never been clear to me. She spent her money solidifying her base of voters but forgot about the tremendous numbers of people who had not made up their minds. Trump’s message was simply the better marketing plan.

Life will go on. Congress will remain somewhat of an impotent force. Trump will take months, if not years, to learn the levers of his control and power. As with Brexit, the markets will react negatively (they don’t like change) but will bounce back. The global economy will stutter along. Whatever Trump can do will take years to take hold. And he may then lose re-election in four years. Who knows.

Clinton has an obligation to fade away, but something tells me that her ambition to power knows no bounds.

Posted in General Musings | Comments Off on I was Wrong, But There Isn’t a Coattail in Sight

Some Musings on a Decidedly UN-Presidential Election

Clinton Trump Johnson and Stein


My prediction: Trump wins the popular vote while Hillary wins the electoral college. And the popular vote will be split nearly evenly. Hillary, of course, will claim a non-existent “mandate,” while Trump contests the battleground-state votes through legal means. Meanwhile, the Congress will shift slightly toward the Democrats, but not enough to gain any sort of majority capable of supporting a President Clinton as she goes about governing. Investigations and even impeachment will commence almost on Day One (and I agree that they should), and our government will be too hamstrung to get anything done.

Not that there’s anything wrong with that. What we do NOT need are more laws. We have enough. And we have enough national debt, a debt that arose from trying to do too much with too little.

But what has me worried the most is the near-unstoppable forward motion of the bureaucracy. Absent any sort of adult supervision (or perhaps because of the wrong kind of supervision), bureaucrats have an unspoken mandate to continue promulgating regulations that no one voted for or against. A president without a clear mandate for change cannot do anything to stop the bureaupathologists. Or worse: a President Hillary will see the bureaucracy as her only means of effecting whatever change she believes in, and will work around Congress in much the same way that her mentor, Obama, has done.

I am a member of that “basket” of citizens that Hillary labeled the “deplorables.” What a sad and pathetic way to begin your rule: by labeling half your people as deplorable.

And how decidedly un-Presidential.

Posted in General Musings, State of the Nation | Comments Off on Some Musings on a Decidedly UN-Presidential Election

The Key Word is “Affirm” – A Russonian Take on Ward v. Wilbanks

Ms. Julea Ward


Julea Ward has become, sadly, a whipping girl for the American Counseling Association. And because it is so easy and intellectually unchallenging to do so, demeaning her publicly has become something of a cause célèbre for the faculty in many counselor education programs. I say “easy” and “intellectually unchallenging” because I believe that few of those educators have taken the time to read the opinion of the Sixth Circuit Court of Appeals, which ultimately remanded the case for further consideration. Had they done so – that is, had the faculty taken the time – I doubt that they would be so quick to demonize Ms. Ward.

Let me begin by explaining the case.

Julea Ward was a Master’s student in counseling at Eastern Michigan University (EMU) over a period of three years, from 2006 to 2009. As with all accredited Master’s programs in counseling, EMU’s required Ward to enter what we call a practicum class, during which she would see real-life clients as a counselor-in-training. I went through similar practicum training, and I must tell you it was among the highlights of my program. The chance to help people while receiving real-time training and feedback is an honor unlike any other.

Of note are EMU’s policies and procedures for Master’s students to follow as they progress through the program. Fairly standard, boilerplate-type stuff, the policies are full of what you might expect: things like professionalism, attendance, good grades, graduate-student decorum, and so on. Mine were no different. However, EMU had an interesting requirement that I do not recall from my program: that of requiring its students to “affirm” a client’s values during counseling sessions.

Ms. Ward had repeatedly told her professors that she could not and would not “affirm” a client’s same-sex relationship. She had similar reservations about “affirming” conduct such as extra-marital relationships. She made it clear, repeatedly, that her faith (Christianity) prevented her from doing so. Moreover, she made it clear that she would never “affirm” problematic behaviors such as drug use, suicidal behavior, murder … etc.

Ms. Ward also said that she would be more than happy (indeed, honored) to help anyone with their struggles in life, up to but never including affirming behavior that she felt violated her beliefs. This is critical. To my mind, it changes everything we have been “taught” (read: had rammed down our throats) about the essence of Ward v. Wilbanks.

Anyway, and as you might imagine, her stance and her overt allegiance to her faith did not sit well with her “educators.” Nevertheless, her stated convictions did not seem to matter to the faculty and she was permitted to continue to take classes toward a Master’s degree (at great cost to her and perhaps even as she ran up student loan debt). She entered the last stage of her program with a very good GPA and (apparently) the forward motion engendered by successful academic performance.

In short, Julea Ward proceeded while her professors waited in the weeds.

They knew that a trap would be sprung on this stupid, God-fearing Christian. I’d bet that they even conspired among themselves to make sure that poor Julea was presented with precisely the kind of client that would get her dismissed from the program. No surprise, therefore, that when she suggested the client be referred, Julea was dismissed summarily. She then sued EMU, and we were given Ward v. Wilbanks, the latter being a member of the faculty at EMU.

One can only imagine the sanctimonious group of faculty who gathered to pass judgment on Ms. Ward. They undoubtedly took the time to carefully and ever so intellectually weigh the needs of the profession and its code of ethics against Ms. Ward’s heartfelt conviction. They even went so far as to assert that the discipline being considered against Ms. Ward was somehow in Ms. Ward’s best interests. Their heartfelt convictions, you see, took precedence over hers. Mao would have been proud.

There was only one problem: she had not violated any code of ethics. She had reflected on her limitations as a counselor (something our code requires us to do) insofar as the required “affirmation” was concerned, and concluded that a referral to another counselor was in order (again, something our code of ethics requires us to do). Before that, she went above and beyond our code by offering to see the client until and unless referral became necessary. In other words, it was entirely possible that the “affirmation” would never have become necessary. It was the school (EMU) that jumped to conclusions and made the referral mandatory.

Is it me, or does this smack of a set-up?

The star chamber of “faculty” told Ms. Ward that her behavior had violated two provisions of the American Counseling Association’s code of ethics by (1) “imposing values that are inconsistent with counseling goals,” and (2) “engaging in discrimination based on sexual orientation.” Let us examine those in turn.

First, the ACA code of ethics does NOT prohibit values-based referrals like the one Ward suggested. Consider the entire text of the relevant section of the ACA code:

Counselors are aware of their own values, attitudes, beliefs, and behaviors and avoid imposing values that are inconsistent with counseling goals. Counselors respect the diversity of clients, trainees, and research participants.

The district court asked, “What exactly did Ward do wrong in suggesting the referral?” If anything is clear, it is that Ward was acutely aware of her own values. She made those values clear in repeated discussions with classmates and professors. It could therefore be argued, as the court in fact did argue, that Ward’s referral suggestion was made precisely to avoid imposing her values on homosexual clients. Her referral suggestion not only respected the diversity of practicum clients, but also conveyed her willingness to counsel gay clients about other issues – everything except the required “affirmation.” That posture was confirmed by her equivalent reservations about “affirming” heterosexual clients’ extra-marital sex and adultery.

Second, the ACA code has this to say about discrimination:

Counselors do not condone or engage in discrimination based on age, culture, disability, ethnicity, race, religion/spirituality, gender, gender identity, sexual orientation, marital status/partnership, language preference, SES, or any basis proscribed by law. Counselors do not discriminate against clients, students, employees, supervisees, or research participants in a manner that has a negative impact on these persons.

The district court again asked, “What exactly did Ward do wrong?” If anything, it was the “faculty” who were violating the code in their treatment of Ward. Beyond that, we must ask, “Would the faculty insist that a Muslim counselor tell a Jewish client that his religious beliefs are correct – that is, AFFIRM Jewish religious beliefs – if the therapy discussion took a turn in that direction?” Would the faculty require that an atheist counselor tell a person of faith that there is indeed a God, if the treatment were to take such a turn?

The facts of the case also highlight that the referral did not have a “negative impact” on the client. Quite the opposite, as the client never knew about the referral!  Indeed, the client may have received better counseling than Ward could have provided.

The ACA filed a brief in this matter. I see this as a waste of the dues I pay the ACA, but beyond that, the brief exposed the association’s political agenda. The ACA, you see, has weighed in on assisted suicide in a profound way. It even went so far as to carve out an exception to the prohibition on values-based referrals by creating a code of ethics section which reads:

“If a counselor chooses to not work with terminally ill clients who wish to explore their end-of-life options,” then pursuant to this code section, the counselor may refer.

Many counselors, me included, are not comfortable with suicide, assisted or otherwise. If my state legislators were to debate an assisted suicide law (so-called “death with dignity” statutes), I would be writing letters asking that they defeat the measure. I am against what states such as Oregon have done in permitting physician-assisted death. But that is my political position. I would never ask that my professional association adopt my political position. At the same time, I am offended that they have adopted one that is totally counter to my political position. I would have preferred that they simply stayed out of it.

[Parenthetically, as I understand the law (and I have read the relevant statutes) in Oregon, the law was passed after considerable pressure applied by physicians who wanted immunity from legal action if they were to assist a suicide.]

EMU was its own worst enemy. It turns out that the program repeatedly permitted referrals for ostensibly values-based reasons. In one instance, it permitted a counselor-in-training to decline to treat a client suffering bereavement because that student was herself dealing with issues of grief. The university clearly avoided assigning clients into what would have been unsuitable match-ups. Why did it insist on treating Ward any differently?

Answer: Because her conflict arose from firmly held Christian beliefs.

An even worse answer?

Because her convictions conflicted with the “faculty’s convictions.”

One last point: EMU accused Ward of hypocrisy, saying that she had demonstrated a willingness to treat clients suffering from issues surrounding abortion, child abuse, and murder. But in all those cases, the University did not require that Ward “affirm” the values underlying the conduct.

The problem boils down to that simple word, “affirm.” Had EMU exorcised that word from their policies, perhaps things would have turned out differently. I would add that “affirmation,” in the counseling application of the word, is all about unconditional positive regard. It does not require that I agree with the problematic behavior. Not at all.

The timeless question is this: “Would you have a problem counseling Adolf Hitler?” My answer has always been, “No, no problem at all. I doubt he would present for treatment, but if he did, I would have no problem trying to help.”

Just don’t ask me to “affirm” his values-based behavior.

Posted in Blogging, Business, Counseling Concepts, Dissertation and Research, General Musings, State of the Nation | Comments Off on The Key Word is “Affirm” – A Russonian Take on Ward v. Wilbanks

A Land Where Process is King

I wish this were a fairy tale, but alas it is not.

Imagine a world where process is king. Where process has ruled for hundreds of years. Where process and its princes have successfully subordinated outcome to indentured-servitude status. Where even the faultiest outcomes are acceptable provided King Process has been obeyed.

So that your imagination might be enriched, let us put King Process on a throne. And not just any throne, but one made of heavy granite blocks, quarried from the nearby hills and plains. The blocks, all of the same texture, size, and color, are immovable and allow King Process to sit high above the multitude, a constant reminder that process is to be obeyed above all else.

Bloodlines being what they are, Kings would come and go, but they were always of the same deportment and ego: they were raised to know (or at least believe) that once they ascended to the throne, their world of process would reign supreme. Damn the outcome! Off with its head! Outcome is beneath us!

Occasionally, there would ascend to the throne a King Process who understood that outcomes were not dispensable, or at least not as discardable as previous Kings had treated them. We might look upon these Kings as enlightened, having witnessed a need for change and for the outcomes that supported that change. These Kings might have seen previous Kings of Process as being too much the anchor and not enough the engine.  After all, the rabble of students and staff and other plebeians could be seen as wanting an outcome of eventual value to the throne. The enlightened kings would first attempt to have some granite removed from the throne, in order to lower it closer to the undulating masses. But the stones, the building blocks of process, could not be removed without weakening the throne for future kings.

Don’t lower the bridge; raise the river!

So they decided to raise the status of the plebeians. Not too high, mind you, but just enough to quiet the militant rants and to make them easier to hear. The king would insist that the plebes follow a process that fed into their process – process upon process, subordinate process uploaded into the superior process – and only then would the king accept the outcome of the now-raised status of the rabble. Parenthetically, the outcome would now become the King’s outcome; it was his idea all along, you see.

Process upon process. And the “franchise” awarded by the king to the plebes could be revoked at any time, especially should the kingly process not be honored.  “We will allow you to rock the boat, dear Plebes,” the Kings of Process would say, “but do not think we will allow you to topple the King of Process.”

Sound familiar? It should. We encounter it all the time. Think of a time when you have wanted to challenge a status quo (we therapists call it homeostasis) only to be rebuffed and told to follow a process. Your response might have been, quite justifiably by the way, “But it was the process that got us here!” In point of fact, you were wanting to challenge the process, only you saw it in terms of the outcome, the product of the process.

Recently I was asked to address an outcome – indeed, to create an outcome – only to have my attempts rebuffed by the Kings of Process. I approached it by naively assuming I was at the same intellectual level as the Kings, and proceeded to work toward the outcome, the product if you will, thinking, “How could anyone object to such a wonderful product?”

Ah, but since I had presumed to be an equal, to have the same degree of vision and oversight as those atop the Throne of Process, I was to be disabused of such pretense. In Australia they call it Tall Poppy Syndrome. I was, in effect, a poppy that was about to grow too tall, and I was to be cut down. They tried to wield their hedge-cutters, their scissors, to whack me back down to size.

They failed.

Think of your own organization. Have you allowed a Royalty of Process to reign supreme? Is yours a land where all the poppies, except those in charge, are of the same or similar height? Do you permit some to wield weed-whackers of their own egos and cut the ascendant down to size? Self-reflection time, children: Do you yourself whack people down to size?

To grow any organization is to fertilize a field of poppies any one of which might grow taller than the others. To whack the tallest poppy is to ensure a field of mediocrity and inaction. The wielding of scissors is bullying, pure and simple.

Those who can, do; those who cannot, bully.

Think about it.


The Orb and Its Enemies: Toxicity Squared

There once was a company called The Sharper Image. It had brick-and-mortar stores, typically in airports, but now lives on as a web-based catalog. They might have stores still – I don’t know – but the point of this story is my memory of a fascinating product they once marketed, a product called The Orb: The Self-Contained World.

Back when I was a traveling executive, I would while away the airport hours by shopping the Sharper Image. I would usually buy some little trinket – stupid stuff, the kinds of things that lost whatever allure they had by the time the next flight ended. But it was The Orb that fascinated me the most. Sadly, because of its heft, it wasn’t something I could buy and take home.

The Orb was a self-contained world. It was an aquarium into which you would add water, the little packet of sand it came with, some twigs of a plant designed for self-contained worlds, and then a packet of microbes or some such thing (probiotics, I suppose). It came with a coupon for mail-ordering whatever live animals you intended to throw in – but not fish. These “live animals” were of the kind that grew to a certain size and then no more. They stopped growing, perfectly content to live in their self-contained world. So they weren’t what we would recognize as fish, but they were big enough that you could not miss them there in the water.

Anyway, you would get it all set up, add the animals (whatever they were) after the mailman arrived, then screw the top down and wait for eternity. Being a self-contained world, it would in time achieve some sort of homeostasis, and then just, well, sit there.

Nice to look at but with not much going on.

Because it was self-contained, there was nothing for its owner to do.  It would generate all the nutrients required for sustenance. All you had to do was look at it. And, never, ever, open it up and put in something new. NEVER, for that would disturb the homeostasis. The Orb would react like a human body rejecting a new liver, contorting and twisting to eject and reject the new-whatever. Remove the new object and The Orb™ would return to its steady state.

What had happened, of course, is that The Orb had become a toxic environment. A nice place to live, but you’d never want to visit. Does that make sense? God forfend that you should happen to fall into The Orb! You would be ejected like Susan Sarandon at a Clinton rally. The Orb was for its original inhabitants only – nothing new, no one new, no new ideas, no new growth.

Sound familiar? It does to me. I’ve lately experienced being tossed out of an Orb.  Flung out and shaking off water like a Sheltie just showered, I looked back and saw The Orb quickly reasserting its homeostasis, trying to act like nothing had happened.  Its three little animals – whatever they are (I didn’t have time in The Orb to examine them up close) – jabbering to each other as if to say, “What the hell was THAT? How dare he come into our world! Aren’t we glad we have each other?”

In family systems theory, we are taught about the trials and travails of the typical family and the contortions it goes through to welcome a new member (let us call that “memberment”), to eject or lose an old or unwanted member (let us call that “dismemberment”), and the tendency of some family members to lose their individuation and to join what we might call the “undifferentiated ego mass.” It is this latter thing, this handful of goo we call an ego mass, that is perhaps most problematic: It hints that unless you are willing to lose your individualism and meld into the goo, you will be ejected. You will be dismembered. And, for that matter, you will be dis-remembered too.

And should you decide to join into the goo, you will have lost yourself, perhaps like Dante forever. Of course, that won’t be apparent to you: Looking at the others in the goo, in The Orb, will be like looking in a mirror.

Self-awareness and repetitive, recursive self-reflection are needed to guard against having your small group become an Orb. Examine how your group – your family perhaps, or your work group – may have become a self-contained world, unwelcoming of new ideas and new members. Hold up to the light of day the manner in which you have created that toxic environment. Look for the signs that you, like those animals that came via mail order, have reached a certain size and then stopped growing. Have you stopped writing? Have you stopped researching? Do you and your fellow Orb inhabitants work together to draw people into the goo and force them to abandon themselves in favor of the group?

Racism, sexism, and ageism are daily reminders – evidence, if you will – of goo. We want people in our group who talk like us, squat like us, and look like us, and unless they change (enter the goo forever), we will reject them and shun them as if they were a kind of kinky bacteria.

Does your group only add members that are, in effect, duplicates of what arrived via mail order? Do you hire in your own image?

I hope not. It is tough to be rejected. Just ask the liver.
