“The limits of my language are the limits of my mind. All I know is what I have words for.”
– Ludwig Wittgenstein, Tractatus Logico-Philosophicus, 1921
As anyone who has an inkling about my leisure activities may suspect, photographing butterflies counts high among my ideas of fun. Never a purist in the field, I usually also capture images of wildflowers, shrubs, birds and even interesting patches of dirt. Such inclusions notwithstanding, it is butterflies that pull me, armed with camera, away from the homeplex and into the wild.
This is not to suggest that I am trained as either a photographer or a lepidopterist. I am not. Everything I know about photography results from clicking the shutter and examining the results, over and over. Everything I know about butterflies has been gleaned in backyards and prairies and along the edges of woods, or by poring through field guides, most often at night, when I compare images in multiple reference books to the mixed results of my exploratory photography.
Exploratory photography is as apt a label for my method as any. Even with the best available correction, my eyesight is poor. While acute vision is not really critical to negotiating a vast range of life experiences, I have thought often that it might be a handy thing when attempting to identify which species just fluttered past in an open meadow. For better or worse, I do not have this option. My substitute is a digital camera with a zoom lens. I may not be able to see details of wing pattern on an insect eight feet ahead, but I have a quick eye for motion and color, and a pretty good set of reflexes. Digital photography affords a practical method for “collecting” butterflies to identify at leisure without harming the insects.
Given hours outdoors sneaking up on critters, even a nearsighted amateur eventually notices characteristic behaviors of different species—not least behaviors that impact a photographer’s close approach. All kinds of yellow butterflies and, possibly even more so, white ones tend to beat hasty retreats at any suggestion of movement in their direction. Red Admirals, by comparison, seem positively gregarious. Some, like Common Mestras or Fiery Skippers, are indifferent to gigantic camera-toting galumphs until a move is made specifically in their direction from close at hand. Question Marks may be indifferent to photographers, but one will never know. A good candidate for ADHD’s poster bug, no Question Mark ever arrives at any spot in any manner except already poised to leave. It launches aggressive forays at other butterflies entering its territory, with flights of two or three or even more Question Marks zipping around in striking, balletic skirmishes.
Less rapidly sorted than differences in behavior between species are those between individuals of the same species. Like easily noted variations in pattern and color between conspecifics, variations in behavior between, say, Gulf Fritillary A and Gulf Fritillary B, are no less real. Hang around given habitats long enough, and even the wariest species will produce an individual detectably less wary than its siblings. The trick for a photographer is to get photos before the adverse impact of that lack of wariness serves up the happy-go-lucky specimen as a predator’s supper.
Observing the ways in which Gulf Fritillary A and Gulf Fritillary B act differently, one can be tempted to ascribe personalities to them. It is arguable that some conceptions of personality allow such a fancy, but those conceptions cast wide nets for narrow purposes, and do not refer to what I mean when I use “personality” with reference to qualities associated with self-aware individuals. However, we need not agree on a specific definition of the word before concluding that whatever else one may say about it, “personality” exhibits a complex of defining characteristics entangled with some degree of mind. Mind, let us not forget, while never co-extensive with brain, is a quality very much bound to brain—and brain is not something butterflies have in quantity. More able thinkers than I have foundered attempting to define mind. I have a few ideas on the subject, but for immediate discussion, a broad generality will serve. Mind is activity rising from the brain and nervous system that is experientially linked to consciousness and subjectively interpreted as greater than simple electrochemistry.
A quick web search showed that among top hits the brain of a monarch butterfly is compared to the head of a pin (upper limit) and the tip of a ball-point pen (lower limit). Not much mass there for cerebral cortex. In fact (and I mean no diminution of Papilionoidea or their allies), had I bets to place on an entirely sufficient elucidation of butterfly mental state, my money would drop on simple, reactive electrochemistry. As tempting as it may be to call one fritillary lazy and another crazy, the truth springs more likely from variation in how critical proteins form in cells of their respective heads. Wondrous and fascinating is the palette of behaviors electrochemistry can produce. Of particular interest to a butterfly stalker is apparent recognition of intentionality. I have only anecdote as a basis, but it seems to me that most butterflies distinguish motion merely in near proximity from proximate motion directed at them. I cannot count the number of times I have looked away from photographing one butterfly to find another, typically standoffish, like a Southern Dogface or one of the other sulphurs, seeming to look on from the ground beside my foot. The instant my attention shifts to this visitor, however, the visitor decamps in swift, erratic flight. Somehow, electrochemical reactions identify when my focus shifts, and in doing so, cue retreat.
Spending time in butterfly habitats means I intrude also in the habitats of birds, and I have been known to grab photos of these subjects when opportunity arises. Birds, of course, have brains many times more massive than those of butterflies. On an intelligence scale for animal life, birds (chickens probably excepted) are pretty smart. Some, especially among parrots and corvids—ravens, crows and kin—exhibit behaviors provocatively suggestive of mind. Even among less mental birds (chickens possibly excepted) the ability to detect intent is refined far beyond a butterfly’s electrochemical reflex. A flycatcher swooping and diving as it forages the airspace around a treetop is simultaneously aware of the large biped trundling around the far side of the meadow. The bird reacts when the biped’s gaze fixes upon it, and reacts again when the biped points some object its way. The nature of these reactions has much to do with distance between flycatcher and biped. If the two are separated by enough space, the bird will maintain a higher state of alert and either go on with its foraging or watch to see what the biped does next. If the biped is within the flycatcher’s boundary of safety when the pointing occurs, the bird cares nothing whether the object directed its way is a camera or gun. It departs.
A flycatcher’s spectrum of reactions demonstrates interpretive ability more complex than the discriminations of a butterfly. Even if not mindfully, the bird judges what it observes and tailors its own behavior as a result. I say “Even if not mindfully,” because there are those who will object that flycatchers have no mind. That makes no difference to this discussion. Mindless judgment is a staple feature of many software applications, and computers are far less mentally flexible than birds. Given that the story of animal mindfulness must be rewritten constantly as each limit attributed to it falls in the face of research, I tend to think there is an element of mind involved. A bankrupt Cartesian faction may still cling to notions of a privileged (if no longer separate) creation for humans, may still try to limit allocation of mind to humans and “higher” animals. Their arguments do not persuade. Studies of recent years have found evidence of self-awareness, tool use, advanced mechanical problem solving and play in octopuses, which, to speak uncharitably, are free-swimming, many-armed, naked cousins of clams and oysters. I suspect mindfulness penetrates widely across the animal kingdom, including far extensions at wavelengths, like those of radiant energy, imperceptible to our unaugmented senses. And I suspect that if we are honest about who and what humans are relative to our place in existence, then we must admit that research, if not our troubled hearts and appetites, has shown conclusively that humans are a species of ape and that the great apes deserve to be understood as kinds of people. The scientist and author Jared Diamond has even argued that chimpanzees and bonobos should be included with us in the genus Homo.
Among traits that humans and apes have in common is a generally refined interpretive ability. This not only recognizes gross intention, but opens up an entire order of additional granularity in deciding what another individual might be about. Besides motion, gaze and pointing, humans assess expression, posture, nature of clothing or condition of pelt, objects in hand, etc. Rapid decoding of intention in others is a skill shared by hominids and a few other species, primate and otherwise. A remarkable thing about this skill is how sophisticated and powerful it is in a general sense. Humans, for example, are really good at determining at a glance—with one notable exception—the overall emotional state and attitude of another being. Long before we can formulate a verbal basis for the knowledge, we know if someone is angry, happy, threatening, gloomy, distracted, etc. Virtually the only general emotional condition most people either cannot recognize or cannot respect is mental absorption. As critical to human achievement as the ability to think about a subject with single-minded focus may be, no other emotional or mental state is such a magnet for intervention. Surrender to rapt attention, and people will come out of the woodwork to probe what you have, what you’re thinking about, what’s up, what’re you doing, let me see, let me touch it, I can help….
Apart from mental absorption, however, humans, their near kin, and sentient peers are incredibly good at such rapid discrimination. In fact, we are so good at it that it is almost astonishing how bad we are at interpreting specifics of any given emotional state, and especially how absolutely incompetent we are at accurately ascribing motives. This incompetency is only almost astonishing because it doesn’t take much reflection to understand that what is genuinely astonishing is that we ever get it right at all.
The key here is that cues from others to which we react when we ascribe emotional state are informative only on a general level. Facial expression and body language supply nothing in the way of narrative explanation for the signals they send. Humans, however, are narratizers on a grand scale, and we tend to fill in gaps in information with details that feel consistent with what we think we know. A term from analytical psychology used to describe this mechanism is “projection.” Specifically, this is when one attributes to another person motives or traits rooted in one’s own psychological state rather than real conditions of interaction. In other words, one “projects” a narrative onto the other. The misattribution is often entirely unconscious, and it can have a snowball effect. The more one projects onto another, the more one thinks one knows about the other, and the more one’s understanding of the other’s apparent motive deviates from fact.
Projection is not a mechanism found only in rare individuals, nor is it something that can ever be entirely rooted out or overcome by determined consciousness. We may talk meaningfully about degrees of slippage resulting from projection, but never about its absence. To crib from Firesign Theatre, it’s “in everybody’s eggs.” If this is true in the case of interactions between persons actually present with each other, it is an even larger factor in mediated interactions. Recall how many people misunderstood that status update you thought was transparently tongue-in-cheek. Sadly, it seems that the greater the stake a recipient has in a given communication—even an apparently information-rich communication—the greater the likelihood of projective misattribution. On occasion, the result is innocuous; on occasion, it is scorching.
Recently, during a bout of mental disorder, I placed profiles on a couple of online dating sites, to illuminating effect. Sorting through several dozen utterly inappropriate responses, I made connections with a very few engaging candidates and exchanged brief correspondence that ended on a friendly basis, and I actually met one attractive, charming, intelligent woman with whom I simply did not share sufficient basis for the kind of relationship she sought. Also among the respondents were a number of wacko projectionists. These last were identifiable sooner or later by the criticism they voiced about the “truth” they thought they perceived behind details of my profile. A couple did not raise the negatives revealed by their analysis until after we had exchanged a few messages, but several contacted me for no other reason than to criticize one or another detail that rubbed them the wrong way. They did so with notable lack of kindness. To me it seemed that in every one of these cases the projectionist had not only misunderstood whatever statement had riled her, but, further, was unequipped to give the benefit of the doubt to any detail she failed to understand. Either that or I am a completely messed up mental case unable to appreciate the favor I was being shown by having my lapses pointed out to me. Whichever is true, these experiences led me to ask whether the initiative was one I wanted to continue. When a friend asked if I thought I even had time for dating given current constraints on my time and resources, scales fell from my eyes and I terminated the offending profiles.
In the wake of this effort to conjure social connection through the internet, I found myself mulling over the treacherous interplay of perception and projection. Unsurprisingly, having the subject prominent in my thoughts sensitized me to examples day by day, and not just in interactive exchanges. Projection may swamp perception entirely when the feed runs only one way, as in the case of celebrity. Public reaction to individuals in the news, whether for newsworthy and/or newsbaiting activity or for mere spectacle (manufactured or otherwise), is governed at least as much by interpretive projection as by factuality. Even given exposure as heavy as that of reigning figures in popular culture—rock stars or film stars or stars of new media—the amount of information revealed about such people is so slight (and the amount assimilated into public discourse inevitably less), that they exist for us as surfaces painted by the freeze frame of public situations, the snapshots stripped of depth and authenticity by the selectively interpretive media lens. Charlie Sheen, Lady Gaga, microcephalic first graders able to miraculously parrot conservative evangelical interpretations of the synoptic gospels, all are largely blank screens about whom our attitudes are shaped as much by how they are framed, how lit, how described, how juxtaposed to other icons, as by what we convince ourselves we know about them. The most public figure on earth is largely a cipher onto whom we project qualities, details and assumptions that we—as individuals and as a society—need, wish or fear to be the actual case. Having done this, we forever after confuse the identity of the human being somewhere at the root of the icon with the emblem we create and to which we affix the icon’s face.
Substrates onto which we attach our prejudices, the human beings who wear media personalities like burqas seem to excite projection in inverse proportion to whatever factuality may accrue to them. Thus, the pop culture icon about whom we know the least—say, a frothy, popinjay singer adored by a subculture not our own—inspires us to construct a portrait substantially colored by the content of our personal imperatives and the assumptions we make. Inevitably, the most thoroughly constructed portraits, being those most in agreement with our assumptions, tend to become those most fervently and unshakably held. This represents a significant deformation in our perception of reality, which should be reason enough in and of itself to adopt a measure of humility regarding the views we hold and the certainty we attach to them.
Of course, the mechanism of projection does not cease to operate when it crosses beyond those humans whose most perceptible relation to us is utterly symbolic. Individuals transmogrified into emblems by popular culture remain human in line and proportion, and humanity is something about which most of us have at least a clue. If we step past human beings to subjects about which we know nothing whatever and about which nothing can be determined, we find the mechanism of projection employed overtime, airbrushing information voids with supposition, wishful thinking, fears groundless or otherwise, random conceptual recombinations and outright invention. In no case is this more apparent—or more vehemently denied—than in the case of so-called “ultimate” questions we ponder as sentient creatures curious about our origin and end.
Why do things exist? Where did existence come from? Why do we exist? What happens when we cease? Every answer proposed for these questions is purely conjectural. There is no scrap of empirical support for any of them. Does existence exist? I think from a consensual view we can give that one a resounding YES! Can we make with certainty any verifiably factual statement about why it exists or from whence it emerged? Not given current limits on technological, perceptual and conceptual resources, and likely never. Does this stop us from trying? Not for a heartbeat.
On the evidence, I would venture that the less a conjecture can be verified, the more freely people conjecture; thus the subject upon which the most extravagant conjectural flights are launched is the one about which absolutely nothing may be stated with certainty. I grope for a term appropriate to describe the impetus behind the agency that drove existence into being. Prophet after prophet has reacted to the utter silence and utter lack of face of that impetus by vomiting aspirations personal and social to replace the absence of information with an artifact that just so happens to look exactly like one would think a revelation from God would look. Naturally, these wholly insubstantial architectures of belief are those maintained with unyielding conviction. Especially bitter is the divide between the misguided camps of the theist vs. atheist dispute.
Not to give atheists a bye, but I particularly dislike the word God, capitalized and singular, waved like a mighty cudgel by many of the theistic party. An atheist may claim as a statement of fact that there is no God, but the statement, although as untestable as its opposite, states a negative, which means it asserts nothing about the nature of God’s nonexistence. When theists, on the other hand, use the word God, the mere fact of having a label (particularly such a privileged label) leads them to a view that they know something concrete about the invisible, ineffable intangibility to which the label is pinned. Lurking somewhere in the act of naming a purely conceptual thing is magic, a glamor that allows the unreflective namer to behave as though the concept has taken on mass and texture. The weight of generations of theists making bald assertions about the nature of God without a scrap of evidence reinforces the erroneous notion that unsupported claims intersect actual knowledge. In fact, and the only verifiable fact attached to theistic notions, is that even if God exists, nothing can be known about Her/Him/It/Them. No authorial trace emerges in company with Being through the veil that separates us from the cause of which existence is a consequence. Even if we posit God for the sake of argument, there is no attribute of God that is observable, much less identifiable, from within our finite, limited perspective. Anyone mouthing words about God’s will, God’s word, God’s plan, God’s pleasure or displeasure, God’s love or lack of it, God’s gender or any other purported characteristic of the divine, is talking pure smack. Even if the language faithfully reproduces sacred trash talk of earlier generations, it has no objective authority as description of divine attributes. 
A great tragedy of such claims is that humans have butchered more humans on the basis of beliefs about God-cum-transcendent-being than for any other instigation, and all those deaths lie at the feet of an understanding of reality built on foundations less substantial than gossamer.
We may take the big three attributes ascribed to God—omnipotence, omniscience and omnipresence—and turn them from ascriptions to definitions. In other words, we scale the discussion by defining God as that which encompasses all possible potency, all possible knowledge and all possible location. This does not move us one step closer to a transcendental agency. Instead, we have merely attached a label, God, to the set of all potency, knowledge and location. It may be a poetically mathematical analogy, but that does not make it objectively true. There is simply nothing about the ultimate, or first, cause of existence that can be known. The mystery of ultimate cause is impenetrable. It is difficult to stress sufficiently that no condition in the dancing space-time we share can provide inarguable certainty about first cause. Behead as many infidels as you like, burn as many heretics at the stake as fuel can fire, these acts cannot change the fact that citation of God as anything other than shorthand for the unyielding, undefinable mystery of first cause remains forever a fantasy. All the institutions posited upon certainty that God is a reality with an aspect comparable to person, not to mention all the prayer and all the rules, are, in the end, without basis. Ironic that a personate god is not even necessary to give basis and meaning to prayer, morality or communities of values. Certainly not to the stories we tell.
Of course, what God is—in lives, in the literature, in mythopoesis—is projection. This is why, despite the superlatives attached to all names and labels of the divine, God is never larger or wiser or more organically consistent than the imagination of any given believer. It’s also why the God of so much belief is vengeful, vindictive, petty, jealous, arbitrary, violent, autocratic, murderous and, in short, so capacious a portmanteau of humanity’s lowest behaviors. A mirror reflects only what is visible before it. Small wonder the hounds of God must resort so often to flame and blade to enforce their stone-aged creeds.
Is it possible to purge the word God of these contaminants? Can we shuck externally imposed regimes of belief from this half-teaspoon of phoneme and rehabilitate God as a convenient handle for mystery vaster and more profound than any possible conception? We may try, of course, and cleave to the usage in our hearts, but when it comes to changing significance of the word in speech, I am pessimistic. After more than 150 years of repeated explanation, thoughtful minds still have not managed to make clear to much of the general public that when the word “theory” is used in a scientific context, such as “a theory of evolution by natural selection,” it means something quite particular and other than when the Reverend Bill claims to have a “theory” about why God chose to test us by creating fossils in the ground, or when Uncle Willie has a “theory” why the government wants him to doff his foil helmet. Resistance to this elementary distinction has been intractable, and “theory” is not even a term central to non-scientific systems of belief. Cleansing the word God of encrusted doctrine will be a project incalculably more difficult. Personally, I make a conscious effort to avoid the word, deliberately choosing to use Goddess instead when I wish to make symbolic address to transcendence, and calling the great mystery of ultimate cause simply Mystery when wonder and yearning draw me to grapple with it. It’s much more difficult to paint the word “Mystery” with the lines of my own image than it is to do so with a word fashioned millennia ago into a battered human mask. I do not think trying to save the word God is even an appropriate focus for partisans struggling to liberate wonder. Let us concentrate on freeing wonder itself—which is done every day, anew, by freeing one’s mind—and the word God can come along if it proves it can keep up. What we achieve will not be a loss of capacity for belief, but loss of the lie that is certainty.
How to free one’s mind? A good beginning is always to attempt to short-circuit that confounding mechanism, projection. One can never be wholly successful at this, possibly not even largely so; nevertheless, making the effort is always instructive. A good start is to strip God of the trappings of person, which, after all, is a construct of organic self-awareness inhabiting observable space-time, i.e., mortal beings. Like everything else in existence, personhood traces back to the mystery from which Being emerges. Personhood derives from the mystery; this does not mean that personhood is in any way cognate to a trait of some larger, non-emergent reality. Humans have personhood, apes have personhood, elephants and dogs probably have personhood. There is no credible basis for maintaining that God has personhood, or that talking of divine person is any more meaningful than speaking of the sea as a bicycle. Similarly, we have intelligence, but there is no basis for assuming intelligence as we know it resembles any possible trait of a reality by definition beyond comprehension. God has no detectable gender, no detectable will, no detectable mercy, no detectable interest in us as individuals whose fates bear more concern than the fate of any other phenomenon. The truth of this tries to surface in the credo of those Sufis who defiantly shout, “God is nothing!” Whether the Sufi claim has any more objective basis than another boils down to semantic manipulation of the terms.
From within the specific locus of dancing energy pattern that is me, there rises a suspicion that to speak of God or no God is beside the point. Both camps, I think, grub about in an effort to make distinctions about a realm that is in nowise distinct. Existence may be observed, and thus appear separable from the plumbless mystery that births it, but we have no way of knowing if this is really the case or a misconstrual rising from the limits of our perspective. Wishful thinking is not knowledge, no matter how desperately clung to. For all we are able to determine, it may be a fundamental error to understand existence as anything other than congruent to the mystery from which it emerges. It may also be that the whole of existence is more like a sparkle struck from a wavelet by a stray sunbeam peeping onto a forest pool. A further, cosmic-grade irony that humans bring to coping with transcendence is that despite the impervious unknowability of Being’s source, intimate contact of that mystery with every facet of our world has spawned a host of rigid, competing orthodoxies of certitude. Given the fatal intolerance that travels hand in glove with orthodoxy, the irony is grim.
Yearning, even desperation, for a rock solid absolute upon which to build certainty is an understandable impulse. A hunger for assurance lurks within most of us. Because it is understandable, it may arouse our sympathy, but being understandable does not make the impulse defensible. Neither does it give substance to what can never be more than conjecture. I also hold that those conjectures are not harmless, at least not completely so.
A line may be drawn from the butterfly reacting to motion, through the bird that flies whether gun or camera points its way, through the online prospect connecting dots of a profile in random arrangement, to the imagination that reacts to the veil obscuring first cause by manufacturing certainty from dew and aether. Each point on the line represents action in ignorance, with movement of the line away from the butterfly’s justifiable survival strategy, toward baseless assertions of God/Not God. These last are misguided when benign; when otherwise, they are too often deadly. In all cases, the assertions are wasted breath.
The task before us seems simple enough. Since uncertainty about the origin of Being cannot be gotten around and cannot be reduced, it is incumbent upon us to embrace it. If we require an absolute at which to wonder, that our world emerges from unknowable sources is superlatively wondrous. If we require a revelation of transcendence, we live embedded in the world, which is a tangible revelation unmediated by prophets or clergy or epiphenomenalists. The world is a revelation more directly and authentically connected to its genesis than any other. It is available to anyone at every moment and on every side. That our understanding of the world is partial and will remain forever incomplete should not inspire us to stuff our void of knowledge with clay idols of wishful thinking. Rather, it should impel us to couch our every thought regarding ultimate things in humility. This is never more important than when we contemplate our own place in the world and our relation to others. The stories we tell about why we are here and where we came from are not sacred because they reveal the source of Being, but because they suggest truths about us. Although we cherish those with which we are most familiar, we must yet remember that the stories of others are as good, and that those accounts we love are no less narrative homespun. If we can marvel at the mystery of existence without projecting onto it, if we can cherish our stories without mistaking them for fact, and if we can accept the stories of others as equally sacred, it may be possible to step beyond the limits of language, to push against the limits of mind, and thereby inhabit a reality no more exhaustively understood but, nevertheless, made larger.