Thursday, 29 March 2012

Touchstone Thursday: Philip Rahtz's 'How Likely Is Likely?'

By all accounts Philip Rahtz (1921-2011) was venerated as a human being and as a first-rate excavator. Based on this tongue-in-cheek probe of circumlocution in the literature [what have elsewhere been called 'bullshit qualifiers'], I have to infer that he was also a straight shooter when it came time to present his own inferences from the materials he recovered.
     So, he begins this note in Antiquity by recognizing the word-smithery that is characteristic of so many archaeology publications, and proposes a quantifiable scale by means of which an author can signal to the reader just how secure an inference is.   
Antiquity 49:59-61, 1975
However, Rahtz despairs of such a scale ever finding widespread acceptance, given that the very people who are likely to use waffle words to present their inferences are unlikely to report the certitude of their interpretations honestly, even if given the opportunity to quantify them.
     So, he offers a glossary-style approach, and bequeaths to all who came after these hilarious paraphrases of commonly used expressions.
Antiquity 49:59-61, 1975
But wait! There's more! Even better. He can't forbear offering a similar glossary for the following eclectic aggregation of equally oft-heard phrases. I still can't read these without laughing out loud [er...LOLing].
Antiquity 49:59-61, 1975
I don't care what time or place you specialize in. You'll be able to appreciate Rahtz's brief candle raised in a heroic but inevitably futile effort to buck the trending wind of archaeological discourse [i.e. hot air].

Tuesday, 27 March 2012

I'm Gonna Rot in Hell For This...

   I'm sorry. No. I'm not. This is scholarship? Craig A. Evans is the Payzant Distinguished Professor of New Testament at Acadia University in Nova Scotia, Canada [my home and native land]. This just in....dit de dit de dit di doo doo...[that was supposed to be Morse code, harking back to the days of the Pony Express and canal boats, which is where this ridiculous article belongs]. The headline reads
The Archaeological Evidence For Jesus (PHOTOS)
...claims that there never was a historical Jesus fly in the face of common sense and more than sufficient evidence.
As I approached the text of this article, I guessed that there'd be some less-than-sufficient evidence referred to in this sensational piece of proselytizing. And, guess what. Nada. And it's a durned shame, too. I was looking forward to those photos!
     Obviously Dr. Evans is no archaeologist. He's no rocket scientist, either. Apparently, finding evidence of the sorts of things and people referred to in the New Testament is sufficient to prove the existence of god--or, at least, of god's presumed chimeric son.
     The good Dr. Evans begins by laying out the epistemological underpinnings of his claims. 'I begin by explaining what archaeology is...' he says, 'the excavation and study of the remains of material culture.' Well. Whaddayaknow? Kinda convinces me of what we've always suspected, but were afraid to speak of. Biblical archaeologists aren't archaeologists at all--they're puffed up antiquarians and art historians with an axe to grind. They wouldn't know an archaeological inference if it flew right up to them in a blinding light and proclaimed the second coming of Lew Binford. 
     It's really hard to keep a straight face. Listen to this:
[archaeology] means correlating what we discover with relevant written records (such as the writings of the New Testament and the writings of Josephus, the first-century Jewish historian). It often means applying space-age technologies. It is hard work and it is very rewarding.
This sounds more to me like George W. Bush on the vicissitudes of being POTUS than an archaeologist opining on the empirical basis for the knowledge created from archaeological traces. To paraphrase: we find stuff, and it's just like the stuff they wrote about in the New Testament. That proves it's true!
     You really have to read this to believe it. Er. To find out how a real biblical scholar thinks. Land o' Goshen! They had buildings and tombs back then! Jesus is real. They had religious leaders! Jesus is real. Give me a break.
     What's that rumbling?


Bonus! Anthropology Makes Normal Human Interaction Virtually Impossible

I know I'm walking a thin line when I decide to comment on someone else's commentary on some issue or other. It reminds me of the old saying, 'Those who can, do. Those who can't, write about it. Those who can't write about it, teach.' However, after reading yesterday's 'Why the Cultural Conversation Should Never Stop,' cultural anthropologist Melissa Rinehart has me musing on the impact of anthropology in my own life. 
     Dr. Rinehart writes of the emotional investment that we all make, as anthropologists and as members of the human community. She's not the first, nor I the second, to reflect on the relationship between us and our subject. For Rinehart the subject is a living, breathing people who daily endure a legacy of pain and crushing despair visited on them by the waves of Europeans that invaded the 'New World' after 1492. Hers is, at best, the tenuous relationship between oppressor and oppressed. Her realization of that dichotomy, she believes, makes her a better person.
     By contrast, my subject resides in the deep past--beyond memory. My nominal subjects lived just before and immediately after our species became human in the way that you and I are, and which I believe is exemplified in this activity. As I write and as you read we are making sense of pixels that form 'letters,' which we interpret as streams of sounds that themselves bear no significance to the matter at hand other than that of the meaning assigned them by common consent over the long history of the language we speak. 
     Unlike the author's, my relationship with my subject involves no social negotiation. Nevertheless, my research can enrich the lives of living people, no more or less than that of Melissa Rinehart. My life is, I believe, manifestly more human because of my understanding of culture, garnered from a decades-long relationship with anthropology.
     Both Dr. Rinehart and I are motivated by our emotional attachment to our 'subjects.' As she puts it, her emotional involvement with the victims of Haiti's earthquake made her realize that
My professional contributions, whether publishing or teaching are more reflexive as I've come to recognize purist objectivity is unobtainable. I am more morally grounded and have come to recognize the inherent value of respectful relationship-making whether with students, consultants, friends, or colleagues. Only by engaging in more emotionally intelligent discussions is where I find true learning takes place and this recognition has opened many new doors in my life that I would have never imagined before.
Her awareness of the variety and resiliency of culture has, she thinks, made her a better human being. Some might see this as self-serving. 'Gosh! If I didn't know all this anthropology stuff I wouldn't be able to see the layered and intersecting present and historical forces that abide in even a simple conversation. And if I wasn't so enlightened and emancipated I'd never be able to communicate my impressions to the non-anthropological community, which is where my insights could do the most good' [reminds me somewhat of the wizard Harry's 'participant observer' relation to muggles in the Harry Potter series]. I don't see this as self-serving. I think Dr. Rinehart is on to something. And I'm not even certain I could have put it into words before I read her blog, but I know I can now.
     Anthropology, and through it archaeology, has fostered in Dr. Rinehart, in me, and in my anthropologist readers a deep and abiding sense of the connectedness of all human beings. It sounds fluffy. I know. [Why don't you go hug a tree, Rob, and leave us to the real business of anthropology!] I truly believe that an anthropological perspective is something that chemists, biologists, engineers and business majors don't get, that psychologists and other social scientists and humanities majors may not even get. It's the profound understanding that, even though we can see differences between us and 'others,' even though we are entangled in our separate tapestries of meaning and understanding of the world around us (i.e. our cultures), even with all the linguistic and other impediments to mutual understanding, twenty-first century anthropologists have the opportunity to inform others of our collective humanity. 
     As a result I believe, as Dr. Rinehart does, that anthropology gives us the opportunity to be a different kind of human--one that has the ability to change the circumstances of others of our kind. Whether those changes occur in whom we teach, whom we love, whom we work with, or whom we choose to understand better by walking in their shoes, those incremental changes improve the odds that, to paraphrase William Faulkner upon receiving the Nobel Prize for Literature, humanity will not merely endure, it will prevail.
     Anthropologists can help humanity prevail over its dreadful history, its inglorious present, and its uncertain future, but only if we involve the rest of humanity in our work. Like Melissa Rinehart, I believe that it begins with talking, one to one, just as you and I are doing right now. This medium--call it what you will--is making global understanding more possible than it's ever been before. Let's not squander the opportunities that we, as anthropologists, have been given.


Monday, 26 March 2012

Lots On My Mind; Not Much Emanating...


 I realize that I've been mute more than I've been cute these past few days. Making my fieldwork travel arrangements has eaten up most of my time. However, today I'll be enduring the climax of the two-day ordeal that is the truly dreadful experience known as the 'screening' colonoscopy. [Larry, my dear old friend, will no doubt be 'liking' this blurt, since he's only recently been given an all-clear after successful treatment for the disease, the fear of which has convinced me that I must now do this thing.]

As much as I appreciate the 'need to know' if my body is planning on accelerating my inevitable demise, the medical community couldn't have devised a less appealing regimen if its goal had been to discourage men from undergoing the procedure. I know. I know. Whiner.

Finally seeing the good sense of doing this, I've decided, is part of a syndrome--my having a colonoscopy, I mean. It goes along with the rapidly approaching 60th anniversary of my birth, which, believe it or not, has me thinking more and more about the hereafter. For reals? Yep. It's been happening for years, and I haven't wanted to let anyone know, 'cause after all I'm an atheist, and I've always been proud of my intellectual side. And seeing the latter being pushed aside by the emotions associated with advancing age is not easy to bear. But, I have to tell you what it's really like. As I said, I've been wondering for some time about what 'life' would be like on the other side of 60 and beyond. And nowadays I'll find myself entering a comfortable and familiar space like my bedroom or my equally familiar but often happily avoided workplace and asking myself 'What the Hell did I come in here after?'

I'll see you on the other side...of the operating theatre I mean! I'll save that other 'transition' until much later, thank you very much!

Saturday, 24 March 2012

The Subversive Archaeologist: On the Road in July?

I've put up a page with news on my effort to raise the funds to get me to the Czech Republic for part of the 2012 season at Pod hradem cave.


I'd be honoured if you'd check it out. Click here.  
Then click below if you want to support the Subversive Archaeologist, your all-time favorite blog! 


Merci! Gracias! Thank you!



Triggering A Revolution: Archaeology and the Image of the American Indian

Profuse apologies for the tardiness of the blurt you're about to read. Certain...practical considerations have forced me to bend my will to other matters, all the while worrying that my subversive pals would feel like they'd been left out in the cold. More on those pre-emptive activities in due course. On to the Trigger.

American Antiquity 45:662-676 (1980)
All I can remember of the first time I read Trigger's 'Archaeology and the Image of the American Indian' is a blur. Partly it was because we were assigned so much reading at the time, and partly it was because American Antiquity insisted on using a type size and a line length that prohibited fluid eye movement from one line to the next, as you'll discover if you find the article and choose to read it. The kind of presentation that American Antiquity employed waaaaay back then was the sort that increases the reader's cognitive load manyfold, and comprehension suffers as a result. The experience for me was probably one you've had many times yourself--you read something and you realize you didn't follow what was said. And so you read it again to get more of its sense.
     My hope is that despite these practical shortcomings you'll persevere. And if you do, and if this is the first time you've seen this work, you'll understand just why it's being presented as this week's touchstone. It's a story that can't be told often enough, to enough people, to have it sink in, and yet it's vital to the ethical and humanistic practice of archaeology in many parts of the world.
     Not only is Trigger's article a concise history of North American archaeology's contribution to the image of the Native American (or First Nations, if you're Canadian), it's also a creditable accounting of the intellectual trajectory of archaeology in North America from the 18th century onward. In both strands the author reminds us that the 'subjects' are worthy of our interest and consideration in and of themselves, and not, as has so long been the case, as an abstraction against which to compare other people and other times, or as laboratory specimens used in discovering cultural regularities that can be applied to the experience of the archaeologist--both of which denigrate the living descendants of North America's first and longest inhabitants. 
     Although this example of Trigger's primacy in this corner of archaeological scholarship is aimed at the view of North America's aboriginal people, I can say, having lived in Australia and worked with many Australian archaeologists, that it's the story of that country's stance with respect to its aboriginal inhabitants, as well. Likewise, I believe, in any case of colonial archaeology, irrespective of time and place.
     There aren't any tables or nifty illustrations in this paper. I have the vague sense that Trigger might have eschewed such forms of presentation if only because seriation diagrams and typologies, flow charts and migration routes, and so on, were in part responsible for reinforcing the view of Native North Americans as incapable of changing, culturally, without prodding from outside influences--either by diffusion or migration. 
     Although I'm certain it could easily have been written as one, Trigger's paper is not an admonishment. It's, as it were, a chronicle--nothing more. Of course its intent is to educate, but it never seems (to me, at least) as if he's lecturing the reader. I think his readers are very much left on their own to construct the take-home message, since Trigger's message very much demands that for each consultation or interaction between archaeologists and their contemporaries in the Native American communities a unique approach is needed--one that takes account of the particular, historic circumstances of the group whose archaeology is being engaged.
     There's no humour in this paper as there might be in something written by Flannery. And you won't find a Binford using the opportunity as a 'bully pulpit.' All you'll get from Trigger is a sense of his profound humanity and his deep regard for the people whose history he constructs out of their ancestors' archaeological traces.
    

Friday, 23 March 2012

Pulling the Trigger

I promise. I'll have new news and old news some time on this, the twenty-third day of March, two thousand twelve (which of course ended yesterday in the Antipodes, and will soon be over in Europe and South Africa). So, I'm technically in breach of contract already. So, sue me!
TTFN

How'd It Get To Be Thursday?

Profuse apologies. It's Thursday and I'm haggard at the heart, care-coiled, care-killed, fashed, cogged, and cumbered. There'll be no touchstone today. Night all.   

Monday, 19 March 2012

Mad (as in 'as a hatter') Monday: Blurting Out the Reality Checks, one at a time

If you post something on Twitter, it's called a tweet. If you post something using email it's called... wait for it.... an email. If you post something by snail mail it's called a letter. What, then, do you call something you post on a blog? Simple. A blurt! Or a bleat. Except that 'bleat' makes it sound like you're a sheep, and 'blurt' makes it sound like you're shy. In my case they're prolly both appropriate. So, I flipped a coin, and from now on I'm calling each article that I publish on the Subversive Archaeologist a 'blurt.'


My last blurt, as it happens, unleashed a torrent of comment. One, to be exact. And it appears as if Unknown completely misinterpreted my point(s) with respect to Einstein's thought experiment, at the same time as reminding me that the GPS system and other observations made over the years since 1905 are clear evidence that time dilation exists as a result of Special Relativity (well, not the theory, the physical process that the theory describes [or imputes]). All I can say for certain after my foray into the cosmological realm is that it's a good thing I'm not trying to establish and maintain credentials as a physicist or, for that matter *cough* a mathematician or philosopher of science.


My second-to-last blurt was not so much a criticism of the archaeologist who lately exhumed some of Mesopotamia's history, but of the means by which she and her colleagues 'gained access,' as it were. I thought it was very telling *cough* that they could only venture out in sight of one of the new mondo military bases that've been established not to house the soldiers who aren't still there, 'cause the U.S. apparently pulled out a month or so ago. What's that? They only pulled out the last of the combat troops? My bad. But, that's a good thing, isn't it? I mean, if the people in the bases aren't combat troops then the U.S. is no longer at war with Iraq. Right? OK. So, the U.S. has troops on the ground to protect the personnel and equipment at their shiny new mega-bases (and the Green Zone). Ipso facto they're not there to wage war against the Iraqi people or their neighbours [this sentence contains the elements of a teachable grammar moment, if you have any small children nearby]. So (he asked rather coyly) why are the bases and the equipment and the soldiers and the support staff still in Mesopotamia in great numbers? Occupied? Of course the bases are occupied! If they weren't occupied they wouldn't need the soldiers to protect the people and equipment they surround. Would they? What's that? Not the bases? The country. The country is occupied? Obvies! If it hadn't been occupied there'd never have been a war in the first place, stoopid! Sorry? Oh. Doh! Duh me! Iraq is occupied by the U.S. military because the U.S. has vital national interests there. I see. Kinda like Japan after WWII? No, huh? Unlike Japan, Iraq was a poor country ruled by a vicious tin-pot dictator. So, why occupy Iraq? [Can I please stop the dumb show, now, and get back to the point I'm trying to make as pointedly and as protractedly as possible?]


Professor Stone's intentions may have been, and no doubt were, as innocent as can be. And her hopes for a renaissance of home-grown Iraqi archaeology may also be heartfelt. But, her actions suggest that she and her colleagues are, at best, a little 'tone deaf' and don't realize the signal that they're sending, loud and clear.


Maybe I can draw you a word picture to better illustrate. It's an analogous situation, played out in a university setting, which many of you can appreciate. You could also change the titles and imagine a business context. 


I'm Professor John at WhoAre University. I have a graduate student whom I'm supervising. I've heard that she has a fabulous collection of old vinyl 78s from the early jazz era in America with one-of-a-kind recordings. I love that stuff, and it's really valuable. And because of that, no one I know has anything of the kind. One day I waltz on over to Ms. Q.T. and declare my interest in the music in her collection. And I suggest that she invite me over some time to listen. And just between you and me she kinda doesn't want to, but says OK, anyway. By about the fifth or sixth of my requests to come over and listen, she's desperate to say 'No,' and I'm not about to put her out of her misery. Because she can't say 'No.' 


Why? Why can't Ms. Q.T. just say 'No?' Probly for the same reason that she would have trouble refusing Prof. John's sexual advances. See the relationship between the two isn't equal. It's what the enlightened members of the academic and private-sector establishment have termed a 'power imbalance.' One side, Prof. John, is all powerful--wielding life and death in his palm. Ms. Q.T. is, well, powerless. Almost. She can always take the issue to the university brass and have Prof. John put in his place. The Iraqis? Where are they gonna go? 


Your guess is as good as mine. And, frankly, it doesn't matter if it's oil, or sand, or Ur's gold, or the Great Woolly's petrified faeces. Just being there is bad PR.
   

Sunday, 18 March 2012

Sideways Sunday: Einstein Errs?

Even I go off the track at times. But this is ridiculous. I've never gotten the whole time dilation 'thing.' As far as I'm concerned, anthropologically speaking [of course], time is something we humans have constructed to talk about processes: things that have a beginning, a middle, and an end. As an archaeologist who deals with immense time depth I can't imagine that the universe gives a flying hooh-hah about its passage. And so the square peg of 'time-space' never made it through the more-or-less circular hole in my head where knowledge usually manages to insert itself. The worst of it is that I'm not a mathematician. 

Arithmetic? Fine. Algebra? Barely. Calculus? Forget it. Einstein's math? Only in my wildest dreams. So, it's kind of funny, don't you think, that I should be worried about Special Relativity? [I think it's funny! And perverse, given my thoroughly innumerate self.] And so, what do I do about it? I end up rooting around in the most basic explanation for time dilation due to Special Relativity. And what do I find? Implications that I couldn't, in a million years, make sense of in a way that Einstein's theoretical progeny would understand. It's for that reason that I'm going to impose it on my subversive pals. See what you think. 
[Those of you with more than a sprinkling of physics might wish to disabuse me of any misconceptions in what follows. But please do it gently. My ego is, after all, fragile.]
Patent Office Clerk A. Einstein (ca. 1905)
Einstein’s Theory of Special Relativity* is often illustrated using an archetypal cartoon—a clock, attached to a train, traveling at or near the speed of light. This is a kind of ‘thought experiment’, a heuristic device with a long and venerable history in physical investigation. However, as an example of temporal relativity, the ‘moving train’ model can be seen to fall short of the claims that have been made on its merits, and the mathematical constructs that would seem to support it. As I hope to demonstrate in this post, it's because the observations necessary to support the claim would be unobtainable in the physical world (the same world, one presumes, that Einstein was trying to describe and explain with his theory of special relativity). In what follows I examine the assumptions implicit in this classic thought experiment, suggest what an objective observer would really perceive as the train sped past, and with elementary school arithmetic, demonstrate that this attempt to model Special Relativity fails to represent empirical reality. 

Imagine that a mag-lev train is traveling from left to right, in a vacuum, in total darkness, on a horizontal, linear path very near the speed of light. The only force acting on the train is that which propels it. The train carries a very precise clock that can emit a continuous stream of photons that exits the train horizontally at 90° to the direction of travel. In Einstein’s thought experiment there is a stationary observer; in my scenario the observer is a charge-coupled device (CCD) that is sensitive enough to discern the first photon emitted and each one thereafter.


The train is traveling exactly 1.0 m/s below the speed of light, or 299,792,457 m/s. Thus, during one microsecond-long interval the train travels 299.792457 m (or about the length of three Canadian football fields**). In the standard story, a stationary observer sees the light traveling further in a given unit of time than would an observer on the train. By the time Einstein was mulling this over, most of the math to support the concept was already in place, in the form of the so-called Lorentz Transformation.*** Einstein’s contribution was to propose that the speed of light is a constant. With that in mind, he inferred that time aboard the train must therefore slow down relative to the stationary observer, even as it seems to pass normally for a passenger on the train.
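[For the numerate among my subversive pals, here's a minimal sketch, in Python, of the elementary arithmetic just described. Nothing in it comes from Einstein's paper: the only inputs are the figures already given above (a train 1.0 m/s slower than light, a one-microsecond interval), plus the textbook Lorentz factor, included purely for comparison.]

import math

# A back-of-the-envelope check of the numbers used above.
# Assumed: the train moves 1.0 m/s slower than light, and we watch it
# for one microsecond. The Lorentz factor is computed only to show what
# the textbook account of time dilation would predict.
c = 299792458.0            # speed of light, m/s
v = c - 1.0                # the train's speed, m/s
dt = 1.0e-6                # one microsecond, in seconds

train_distance = v * dt    # distance the train covers in one microsecond
light_distance = c * dt    # distance light covers in one microsecond
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)   # textbook Lorentz factor

print(f"train travels {train_distance:.6f} m in 1 microsecond")   # 299.792457 m
print(f"light travels {light_distance:.6f} m in 1 microsecond")   # 299.792458 m
print(f"Lorentz factor: roughly {gamma:,.0f}")                    # about 12,000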

Suppose that at the very moment the train passes at 90° to our CCD, the photon stream begins. Because the speed of light is invariable, by definition that photon would take 1.0 µs to cover the 299.792458 m to our CCD. And here is where the experiment suffers its first set-back. By definition we would be unable to begin timing the train’s progress because, simple mechanics tells us, the photon that marked the precise moment the train passed would never impinge on the CCD. Rather, after 1.0 µs the first photon emitted would arrive at a point 299.792457 m to the right and 0.000001 m to the rear of the observation point. Thus, the CCD would not have detected the first photon. Even at this early stage of the experiment one already has difficulty reconciling objective reality with Einstein’s theory, because the experiment’s success depends on an observer seeing the photon stream at the moment the train reaches a point at 90° to the observation point. For any theory to be so at odds with empirical reality, and yet be so universally accepted, strikes me as odd. For, if you can't observe a phenomenon, how on Earth can you claim to understand its behavior?

Even if one were to shorten the distance of the observation point the reality would be the same. A photon emitted as the train passed a CCD only a millionth of a meter less than a light nanosecond away from the train (i.e. 0.299792458 m) would still miss the mark and would therefore be imperceptible. In this case, after a nanosecond, the photon would end up nearer to the CCD than in the previous example, only 0.299792457 m to the right of the observation point (or about the length of a northern European adult male’s foot).

In reality, for the CCD ever to ‘perceive’ a photon emitted at the moment the train passes, the CCD would need to be so close to the source (i.e. one photon’s diameter away) as to render the experiment, to all intents and purposes, meaningless. And, because any photon emitted in the manner described above would continue moving away from the observation point in two directions at or very near the speed of light, our CCD would be in the dark for ever thereafter.

At this point the reader might be tempted to say, “Well, this is just a thought experiment, after all. What matters is the concept.” In response, I would suggest that if the ‘concept’ cannot be replicated in the physical world, even in theory, what possible value could there be in the model as a representation of the claim that time is relative? 

Summarizing to this point. The photon emitted exactly at the time that the train passes the observer, and every photon emitted thereafter would be, in theory and in practice, imperceptible to any stationary observer, even a CCD capable of sensing individual photons. For a photon emitted from the train ever to reach the CCD, it would either have to be in two places at once, or be able to exceed the speed of light, perhaps by quite a bit. Thus, Einstein’s illustration fails to provide a compelling case for special relativity in a real world, and the mathematics that describe it must also fail to reproduce reality.

In every practical sense, to be able to track photons emitted from the moving train, our observer would have to be moving with the train, or be in two places at once. Clearly a moving observation point would violate the assumptions of the experiment, and an ability to be in two places at once would violate the laws of nature (Quantum Theory notwithstanding). How much faith or credence can we confidently place in Einstein’s experimental evidence of time dilation if it demands that light, itself, or matter, for that matter, behave contrary to physical limits?

Another manifestation of Einstein’s thought experiment involves a train traveling at speeds much more amenable to human perception, such as the TGV or the Bullet Train. In this alternative experiment, on board the train a beam of light is emitted from the ceiling, and aimed at the floor, such that it spans a distance of approximately 2.5 m. The observer on the train sees a constant beam of light. On the ground, the theoretical observer would see a blurred line of light that began at a point on the ceiling of the train and ended at the floor some distance to the right of its starting point. As the theory goes, the “distance” covered during the process would then be the square root of the sum of the squares of the horizontal and vertical components of the light’s travel, which is a number greater than the distance from the ceiling to the floor. As the theory of Special Relativity depicts it, the stationary observer sees that light has traveled further than it did aboard the train, because on the train it was vertical. Since the speed of light is a physical constant, and the distance traveled on the train is less than the apparent distance traveled in relation to a stationary observer, on Einstein’s account time aboard the train must have slowed down.
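[Again, a minimal sketch in Python of the 'sum of the squares' arithmetic described above, exactly as the standard account would have it. The 2.5 m ceiling-to-floor distance comes from the illustration; the 320 km/h train speed is my own assumption, chosen only to yield a concrete number. Make of the output what you will.]

import math

# The diagonal light-path geometry described above, worked through for
# an everyday train speed. The 2.5 m drop is from the text; the 320 km/h
# speed is an assumed figure for illustration only.
c = 299792458.0             # speed of light, m/s
v = 320000.0 / 3600.0       # assumed train speed: 320 km/h, in m/s
drop = 2.5                  # ceiling-to-floor distance aboard the train, m

t_onboard = drop / c                          # time for the vertical trip, on board
horizontal = v * t_onboard                    # how far the train moves in that time
diagonal = math.hypot(drop, horizontal)       # square root of the sum of the squares
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)   # the dilation factor the theory predicts

print(f"on-board (vertical) path: {drop} m")
print(f"ground-frame (diagonal) path: {diagonal:.15f} m")
print(f"difference: {diagonal - drop:.3e} m")
print(f"Lorentz factor: {gamma:.15f}")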
Yet, as I've implied above--in theory based on physical reality--for an observer to 'see' either the photon stream emitted by the passing train, or the photon stream aimed vertically at the floor of the train from the ceiling, it would need to be on a conveyance of its own, have left a predetermined point to the left at a predetermined time, traveled the same distance at the same speed as the train, and converged at the same end point. In realistic terms, the moving observer would see the first photon emitted by the clock about halfway from the photon stream's commencement to the end point of the journey, and the same observer would record the final photon emitted in the moment it converges with the theoretical light source. Notwithstanding the cataclysm that would result in reality when the two trains collided, I think I've made the point well enough. [And, lest you think that such experiments would be unlikely to occur in the real world, think of crash-test dummies and their circumstances, then multiply by infinity (or some factor just this side of it).] If, therefore, one “unpacks” the assumptions associated with Einstein’s thought experiment, it becomes clear that, regardless of the inertial frame of the observer, the “light” could not, and did not, travel further in relation to a stationary frame of reference.

There is nothing inherently wrong with performing thought experiments, or with ‘discovering’ properties of nature in one’s dreams, for that matter. It has been known to happen. However, it is crucially important that the imagined scenario can be replicated (at least theoretically) in the real world. When Einstein’s mental formulation of relativity is compared with empirical constraints on observation, there is clearly no accord: infeasible observation, coupled with intractable issues of timing and the physical nature of light, makes this demonstration of Special Relativity appear fanciful at best and fantastic at worst. Granted, the existence of a train traveling at the speed of light would also be highly unlikely. However, it is, at least, theoretically possible. A light generator on board is also a real possibility. Equally likely is a timepiece that could control photon emission. However, that is where Einstein’s concept and reality finally part company. After comparing the imaginary to the real it is hard to escape the conclusion that Einstein’s model of special relativity presupposes sensory abilities and light behaviors that are physically impossible. I have no idea what implications this has for the theory itself.

However, someone said to me once that he could derive relativity mathematically, and that he didn’t require the cartoon or the thought experiment to help him. To him I would say simply that, while it may be possible to derive relativity mathematically, everyone knows that mathematics is a human invention, and must at times give way before ruthless empirical reality. Indeed, it seems as if one of the only ways to ‘see’ relativity is in the mind’s eye of mathematics. Ask any logical empiricist: the mind’s eye is not an objective source of empirical data.
This is the sort of 'deer caught in the headlights' look that I usually get when confronted with a similar array of mathematical notation.
All of this might make us a wee bit skeptical of Einstein’s conclusion that time is relative. [It does me, as you can imagine.] And, after a century in Einstein’s sway, it might convince us, once again, to entertain the intuitively satisfying notion that time can neither be speeded up, nor slowed down. This view of time accords much better with the anthropological insight that the consciousness of time is peculiar to humans, and that it is a cultural construct. So much for space-time. Perhaps Time exists, not as Einstein so famously said, so that everything in the universe doesn’t happen at once, but because we humans need to communicate our perception of sequential events in nature that span intervals which are meaningful only to us.

* A. Einstein, Zur Elektrodynamik bewegter Körper. Annalen der Physik 17, 891–921 (1905).

** The length of a “football” field (or pitch) is relative to the side of the Atlantic on which the English-speaking reader resides, and varies according to the rules set by the respective governing bodies of the “football” played there. On the east (or right) side of the Atlantic, the distance is 90–120 m (FIFA). On the west (or left) side of the Atlantic, the distance further depends upon which side of the Canada–U.S. border one lives. South of that line the distance is 91.44 m (NFL). North of the line, the distance is set at 100.584 m (CFL). 

*** Which, curiously enough, was an attempt to understand the universe in terms of the aether, which, up until Einstein's time, was theorized to have been the 'medium' through which electromagnetic energy moved (much like sound waves through air and water waves through, well, water).

Saturday, 17 March 2012

Just a Bit Un-Reflexive. Don't You Think? American Archaeologists Invade Iraq

First, the oil. Next, the goodies. Perhaps archaeologists should know when their actions smack of good, old-fashioned colonialism, never mind that neo stuff. Talk about four-field anthropology! Whatever happened to the anthropological perspective? From SCI-TECH TODAY.com comes this reminder that we really haven't come very far in the last century.
[Note that in this post I do not repeat the name of the archaeologist quoted in this article, which I refuse to use 'cause she might think I'm singling her out for abuse, when I'm really aiming at the establishment and their lackeys! Oh, yeah, you're right. I'm singling her out, just the same. Names or no names, if the shoe fits, wear it. Better to just give the NSF or whoever's grant money to the Iraqi archaeologists and support that country's recovery from an illegal war that decimated its infrastructure, slaughtered hundreds of thousands, demoralized the people, allowed the rich history to be plundered, etc., etc. By the way, this next bit's in red, 'cause it might as well have been written in blood.]
"As the last U.S. convoy was leaving (on Dec. 17), we were headed into Iraq..."
For an encore, [name redacted to protect the guilty] and her colleagues have requested permission to investigate Ur itself, first heavily excavated by British archaeologist Charles Leonard Woolley eight decades ago [emphasis added]. Discovery of the intact tomb of "Queen" Puabi, buried with 52 poisoned attendants and gold adornments, made Ur world famous.
"Woolley was very good for his time, but he didn't have the methods we have now..."
I guess not! But is she talking about trowels and sieves? Or does she mean smart bombs? Water-boarding? Psy-ops? Sexual humiliation? Extraordinary rendition? The Green Zone?
     Leave the land previously known as Mesopotamia to the people--Kurd, Sunni or Shi'a! Jeebuz Criminy! Somebody please remind me we're in the twenty-first century!

Wednesday, 14 March 2012

Turkey Tales

Another comment from Anonymous showed up this morning, pertaining to my diatribe from yesterday on the Combe Grenal eagle talon that putatively gave evidence that the Neanderthals were just like you and me. You'll remember that it concerns an article published in PLoS ONE the other day: 'Presumed Symbolic Use of Diurnal Raptors by Neanderthals,' by Eugène Morin and Véronique Laroulandie. I reproduce it here, verbatim, because it will serve as a springboard to a level of scrutiny that I'd previously hoped to avoid.
Thanks for the food for thought. The idea of raptors inflicting this damage themselves motivated me to ask a friend who's been a falconer for more than 40 years about this, and his response was this: "never have I seen a raptor groom off it's sheath, never mind damage the bone beneath." He's kept eagles, hawks, and falcons of all sizes and sorts. Until there's some experimental work done on this, we can't support or deny this, but his observations are worthwhile. 
To begin with, who said anything about 'grooming off its sheath?' Not I! And, let's face it, there's no evidence that these 'cut marks' occur where one would want to cut to remove the sheath. As I will demonstrate below, to do so would have required cutting distal to the point from which the keratin sheath grows. The 'cut marks' are evidently proximal to the anatomical location of sheath growth.
     To demonstrate, I've scooped up a replica of an osteological specimen that retains the talon sheath, to see where the sheath grows from. Although these are clearly different species, the embryological origin of the anatomical structures would be very similar, if not identical [even if I don't presently have the data to support the contention].  As you can see, the sheath on the Combe Grenal talon on the right would not have grown onto the process of bone within the black square that the authors have superimposed to indicate the portion of the specimen where the putative cut marks are located.
From Morin et Laroulandie (2012)

Credit Skulls Unlimited

     Thus, the falconer and anonymous are introducing a red herring by suggesting that no bird ever groomed off its sheath. In the case of the Combe Grenal avian phalanx the presumed cut marks occur where there would have been skin, not sheath. 
     Unless our falconer was in the habit of defleshing his dead birds, I doubt whether he would have seen the microscopic marks that the authors have discovered on what would have been the skin-covered portion of the talon. And even if he had, I doubt that he could have seen such marks without a powerful hand lens or a stereo-microscope such as the one the authors used to examine this specimen. 
     So, our falconer has proven to be of no relevance in our search for a process to explain these cut marks. The authors can assert, all they want, that the marks they observe would have been inflicted in the course of some symbolically mediated activity. However, if the gifted Neanderthal had wanted to remove the sheath, he or she might have taken better aim--these scratches would have done nothing to impinge on the keratinous covering.
     Now let's have a look at the cut marks themselves. Close inspection of the authors' main figure reveals that the inset closeup of the marks doesn't accord with the photo of the whole specimen. In the illustration below, notice the dark furrow indicated by the red arrows. Now look at the closeup, C. Where is the dark furrow? Clearly the talon sheath wouldn't have grown above the dark furrow. Since the dark furrow doesn't show up distal to the incisions in C, I'm calling this further evidence that the so-called cut marks are not in the proper place if the intention was to remove the sheath. Moreover, the marks are so vanishingly tiny and so clearly V-shaped that they could have been inflicted only by a very thin, very hard, sharp object, such as a steel knife, or the teeth of a struggling rodent, a rock outcrop [perhaps even a flint outcropping!], or similar. I'm unaware of any claims for iron metallurgy in the Middle Palaeolithic [although, god knows, they're probably out there!]. Nevertheless, I imagine that these great birds got around as much as they do in the present, and thus it's not out of the question that such damage could occur naturally, and perhaps frequently, without our or the falconer's knowing. 


Please don't think I'm suggesting that someone embark on a set of actualistic studies of dead raptor talons in support of more robust inference-making with regard to these scratches. I find the whole idea silly, just as I find silly the notion that such a behaviour, if it had occurred, meant anything at all about the Neanderthal capacity for symbolic thought. I still don't see why PLoS ONE saw fit to publish this so-called research.
     Thanks to Anonymous for prodding me to take an even closer look at this silly claim. Call me unkind for calling it silly. Call me vicious, if you want. But at least you can't call me stupid, naïve, or gullible.

Tuesday, 13 March 2012

Announcing World Domination, One Web Page at a Time

I don't know what it means in the grand scheme of things, but I've been thinking that it might be better to separate my 'public' from my private life by means of a stand-alone facebook page for the mega-popular Subversive Archaeologist. All those in favour, go there and 'Like' it! And it'll like you back! Honest. Cross my heart. No lie. For reals.

Monday, 12 March 2012

Mirthful Monday: Combe Grenal Eagle Talon is a Turkey

Combe Grenal (tip o' the hat to Don's Maps)
Chalk up another one for PLoS ONE! From John Hawks's blog today comes news of this hair-raisingly un-empirical beauty, 'Presumed Symbolic Use of Diurnal Raptors by Neanderthals,' by Eugène Morin and Véronique Laroulandie. This is another one of those rapid publications that might have benefitted from a bit more time with the referees. The authors claim that one of the Combe Grenal (Dordogne, France) Neanderthals purposefully removed the talon sheath from an eagle's terminal phalanx to aid in some sort of symbolic activity.
     [!]
Cheers, once again, to Don's Maps! Click to embiggen and look for Combe Grenal in the cluster of red dots in southern central France.
A single [that's right, one only] eagle third phalanx from Combe Grenal, in France [Harold, are you responsible for this unique and important discovery?] is described in this manner:
This well-preserved specimen bears on its proximo-dorsal side two incisions produced by a stone tool. These incisions closely coincide with the proximal margin of the keratinous sheath overlying the terminal phalanx of the digit, which suggests removal of the claw sheath ... . The absence of other parts of raptors in this layer and the fact that bird claws are predominantly made of a tough fibrous protein called β-keratin ... point to a non-alimentary use of an eagle claw.
And here is the magnificent specimen, direct to you from PLoS ONE.
The scale bar refers to the complete phalanx and is 1 cm long. That puts the cutmark at about .25 mm across. Clearly, something extremely sharp made these incisions. Unfortunately the illustration isn't of sufficient resolution to make out any diagnostic stone-tool marks. So, we are left to take the authors' word for it that these are unequivocally the marks of a hominid-wielded stone chip. I strongly doubt their inference, and not just because these are such tiny cuts. Oh, the authors do report two other similar occurrences from Les Fieux. Same story. Same conclusions.
     I know it seems pedantic of me, but would it have been too hard to examine a few living raptor terminal phalanges to see if, perhaps, the habitual grooming behaviours of such animals could leave such marks, on occasion, near the proximal margin of the keratinous sheath? Raptors use their beaks for grooming, and a beak is a rather sturdy structure, and, one would have thought, a very sharp implement that could, no doubt, quite easily put a scratch or two in the relatively porous bone of the terminal phalanx, which almost always gets bits of brain and fur stuck to it during the raptoring part of a raptor's feeding episode. 
     I'm not sure that a geneticist could be expected to act as an authoritative critic of difficult to interpret archaeological materials, so I'll let John Hawks off the hook on this one. However, he does go further and cites another study of modification to bird bone, published in PNAS [another turkey with Erik Trinkaus acting as editor], in which the authors aver hominid modification, again in the form of fine, v-shaped incisions on the bone [more often the effect of carnivore or other attentions], and other markings that convinced them of some form of symbolically mediated preference for wing bones. If anyone wants me to take that one apart, I will. But first, I want someone with an ounce of sense to start refereeing these turkeys.  
     These authors have provided us with not a whiff of empirical evidence beyond an assertion that these marks had to have been made by stone tools. Give me a[nother] break.
     I hope I needn't go further into the matter and discuss the merits of their fanciful conclusion based on the weakly supported inference that these were stone tool cutmarks. I'll spare you my feelings on their assertions of symbolism.

Sunday, 11 March 2012

Why No 14C Dates for Blombos Cave's MSA? Not Lobbing Aspersions. Just Sayin' ...

If I were a chess player I probably wouldn't be trying this gambit. But I'm not. So, bear with me.
Worked bone, stone and ochre from Blombos Cave (Wikipedia)
I need help with a question that's been nagging me for years. I'm curious to know if any of the bone artifacts from the Middle Stone Age (MSA) layers at Blombos Cave have been directly dated using AMS 14C. From my reading it appears not. In fact, it looks as if 14C was abandoned in favour of luminescence techniques once they had excavated deeper than those layers identified as Late Stone Age, the earliest of which were dated to give-or-take 39 ka BP. Everything below that is deemed to be MSA, and organic materials such as charcoal and bone were passed over in favour of grains of sand or burned flints in those strata. 
     Remember that, for most of us, MSA is synonymous with the Middle Palaeolithic in the rest of the world, and, for better or worse, it's exclusively associated with the Neanderthals and their ilk, for whom the jury is still out as to their cognitive equivalency with us modern types. Finding what are clearly modern human artifacts at Blombos Cave and elsewhere on the order of 30 to 50 ka earlier than anywhere else in the world has stunned and amazed scientists from Barrow to Burbank. But it's never sat well with me.
Location of Blombos Cave, South Africa (Credit)
     You and I know that 14C is perfectly capable of accurately gauging the age of organic materials until at least 50 ka, notwithstanding the need for calibration that corrects for environmental  and other effects. Why then do we have only luminescence age determinations for the sediments in which the Blombos Cave bone artifacts and charcoal were deposited? Henshilwood et al. (2002) provide some insight, although I'm not certain they realize that by doing so they've left themselves exposed to (at a minimum) questions about their decision.
     That (to me) curious decision is explained in what amounts to a throwaway comment, which I will quote here

In radiocarbon terms, the MSA at BBC is of infinite age (Vogel, personal communication).
The MSA levels are being dated using luminescence techniques: single-grain laser luminescence (SGLL), single aliquot optically stimulated luminescence (OSL and IRSL), multiple aliquot OSL on sediments and also TL of burnt lithics and electron spin resonance (ESR) of teeth (Henshilwood et al. 2002:638). 
Being the skeptical type, I was intrigued by what seemed to me to be such a weak citation as to the inefficacy of 14C beyond 39 ka--the date of the oldest LSA at Blombos. Just a 'Vogel pers. comm.' No reams of empirical evidence. No other justification. So I endeavoured to discover by what authority Vogel had made such a pronouncement.
     No doubt some among you will think it naïve of me, or worse, that I'm poorly informed and ill-prepared to be taken seriously by the palaeoanthropological establishment. Nevertheless, I had no prior knowledge of Vogel's reputation in the radiocarbon world and in the South African archaeological community. His work includes a 1997 paper in Radiocarbon in which he attempted to calibrate 14C dates with U/Th dates within the same stalagmite from a South African cave. That work was superseded a few years later by the more widely cited Fairbanks et al. (2005), who honed the 14C calibration curve back to 50 ka using pristine corals from around the globe. Their findings are that 14C underestimates calendar years such that 45,000 RCYBP works out to 48,934 calendar years (give or take 500). By this means the calendar date of 39,200 BP from the lowest LSA level at Blombos would have been produced by a radiocarbon age of about 34 ka RCYBP (try it yourself by clicking here to go to Fairbanks's calibration calculator).
     It would seem, therefore, that despite Vogel's pronouncement, cited in Henshilwood et al. (2002), there is no physical limitation on dating organic material that is older than 39 ka (i.e. 34,000 RCYBP), as long as its age doesn't exceed 45,000 RCYBP. That would, theoretically, allow the excavator of Blombos Cave to extend use of 14C for at least a further 11,000 RCY beyond the earliest LSA from Blombos Cave. Surely some of the MSA materials could be presumed to date from this 11,000-year window.
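     [For the curious, here's a minimal sketch of the arithmetic behind that claim. The relation below is just the conventional radiocarbon-age formula, T = -8033 × ln(F), where F is the fraction of 'modern' 14C remaining and 8033 years is the Libby mean-life used by convention; it comes from neither Henshilwood et al. nor Fairbanks et al. The point is only that, at the ages under discussion, the surviving 14C is small but not zero.]

import math

# Fraction of modern 14C implied by a conventional radiocarbon age,
# via the standard relation T = -8033 * ln(F). The ages below are the
# ones discussed in the text; nothing here is specific to Blombos Cave.
LIBBY_MEAN_LIFE = 8033.0   # years, by convention

def fraction_remaining(rc_years):
    """Fraction of modern 14C corresponding to a conventional radiocarbon age."""
    return math.exp(-rc_years / LIBBY_MEAN_LIFE)

for age in (34000, 39000, 45000, 50000):
    print(f"{age:>6} RCYBP -> {100.0 * fraction_remaining(age):.2f}% of modern 14C remaining")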
     Directly dating the bone and charcoal from the upper MSA strata may prove nothing. However, knowing that there's no theoretical limit on the use of 14C for that 11,000 year period leaves wide open the question as to why Chris Henshilwood hasn't attempted to date, directly, some of his bone artifacts or charcoal from the MSA layers. It's just possible that they would yield dates far younger than those produced by luminescence techniques (about which I've had a certain amount to say in previous efforts here at the Subversive Archaeologist).
From Henshilwood et al. (2002)
     After all, the MSA levels in Blombos (at least those illustrated in Henshilwood et al. [2002], shown above) appear to be quite shallow--the sort of depth that you could imagine accumulating in far less than the 35 ka that the luminescence dates would have you believe it took for them to accumulate--and it's within reason to suspect that they could easily have been deposited in the 11,000 RCY or so before 39,200 BP.
     And so. For what it's worth, I'm issuing a challenge from this lofty perch of mine. Chris, try dating some of your MSA bone using good, old-fashioned, AMS 14C and see what you get. A fair few of us are curious to know the result. 

Saturday, 10 March 2012

NewsSpotting Again! Gorilla My Dreams

Oh, I want a girl just like the girl that married Dear Ol' Dad! 
(Credit: San Diego Zoo Safari Park)
I can't resist drawing your attention to the news that the gorilla genome has now been sequenced. Elizabeth Pennisi writes in Science Now, 'A Little Gorilla in Us All.'

Word is that we're more like gorillas in some ways than we are like chimps, even if we are more like chimps and bonobos, overall, than we are like these wonderful creatures. Make of that what you will; the achievement is still quite awe-inspiring.

Be Vewwy, Vewwy, Quiet. I'm Hunting Edits

I have a big editing job to do. Shhhhh. I'll be pretty busy for a day or so. See you when I come up for air.

Wednesday, 7 March 2012

Leading Lam to the Laughter--Her Teachers, Not Me!


From the AAAS web site

There is no way I'm gonna let this go without a response. Mimi Lam, graduate student at one of my alma maters, UBC, has been much in the news of late. She purports to argue that the 'handaxe' was the 'first commodity,' and that its persistence in the archaeological record was due to meaning with which it came to be imbued, including what she calls 'cultural power,' whatever that is. Lam's brand of scholarship is the kind that makes me squirm in my seat when I hear it in person, and makes me scream out loud when I'm reading it in private. It's at least twice removed from the reality of actual Lower and Middle Palaeolithic scholarship, and makes so many perilous leaps of faith as to take one's breath away.
     Although I 'published' this a few minutes ago, it occurred to me as I was walking away that none of this is, strictly speaking, Ms. Lam's fault. In fact, this is about the best 'object lesson' I could think of to illustrate what the Subversive Archaeologist is all about. Ms. Lam bases her entire argument on 'knowledge' that she, no doubt, received from her anthropology instructors--perhaps even her own advisor. That 'knowledge' has led Ms. Lam to construct an elaborate argument that, were any of its assumptions actually true, might (maybe) be cogent. I'm talking about the notion that there is a thing called a handaxe that was the desired end-product of a lithic reduction sequence performed, in the earliest times, by Homo erectus (and variations thereof), and later by Homo neanderthalensis. She would also have been taught, unequivocally, that the 'handaxe' was the product of a mental template in the mind of the maker. That mind may well have been capable of language, and 'culture' and so on.
     So, in what follows, please keep in mind that, but for the occasional grammatical issue (which can easily be forgiven in a conference abstract), the problems that I point out in Ms. Lam's presentation are, on the whole, the fault of her teachers and mentors, and those whose work she has read, and that she is in no way culpable for having constructed this argument on that basis. As such, her paper is an example of just what happens if you let your anthropology babies grow up to be carbon copies of people who peddle conventional wisdom using hackneyed course notes and stenographic repetition of mantra-like presumptions about the hominid past.
     So far the only published version of her oral presentation at the 2012 AAAS Annual Meeting is the abstract that's published on the conference web page. I think it contains quite enough grist for my mill. Enough, at least, to obviate the need to read any longer version that might (improbable though it seems to me) make it into print. Lam's words are shown in white. Mine will be subversive yellow. In her own words: 
I argue that the ability to build
'Build' is not an accurate description of making a stone artifact. It is a subtractive process, and can in no way be considered a 'construction' or something that is 'built.'
portable,
This is hardly a useful addition to this sentence, given that portability was not a question for about a million and a half years. For portability to make any sense at all there must be an alternative, and as far as we know, with the possible exception of nests made in trees or refuges that were used repeatedly, there was no alternative to portability.
durable
The durability is likewise of no relevance here--we're talking about 'tools' made of stone, not, presumably, the ability to choose between making a 'handaxe' out of stone vs. malleable or perishable material. 
artefacts may trace the evolution of human cognition.
This almost goes without saying. 
Hominins evolved a complex suite
In the whole history of Hominidae (or Hominini, if you prefer) this may be true. But for the first 2 million years the 'complexity' of stone tool making amounted to smacking one rock against another such that a sharp chip was removed. True 'complexity' came much later, and depending on whom you canvass, might have occurred during the tenure of Neanderthals and their contemporaries, or not until the advent of modern humans. 
of stone tools, which reflected both emerging individual cognition
I am at a loss to know what is meant by 'individual' cognition. I was unaware of any other kind.
and embodied knowledge.
To say that 'knowledge' was involved in this activity is to presume a level of cognitive ability on a par with our own, which is by no means a secure position.
The manufacture of robust,
Again, meaningless against a backdrop of stone, stone, and more stone.
standardized
This is where Lam starts to stray into areas into which she ought not to have strayed. While it's true that many, many archaeologists have historically held to the idea that the handaxe represented a 'standardized' form, there is by no means a consensus on the question, and indeed very good reason to doubt the assertion. Those who've been visiting the Subversive Archaeologist since its inception will remember this, and this, and this, where I attempt to take apart the idea that there is anything standard, or even purposeful, about the shapes of bifacial cores in the archaeological record until the Upper Palaeolithic.
artefacts may have enabled their trade
Creation of whatever is implied by the 'handaxe' 'may' also have enabled their use as door stops, but that would entail the further inference that doors existed in those times--hardly a reasonable assertion. To say that there would or could have been 'trade' at that time depth is speculative, at best, and cannot be considered a serious hypothesis, much less a well-warranted assumption. Also, whether or not the artifacts were robust or standardized they would nonetheless be tradable. So, this statement is either really ambiguous or truly nonsensical. You choose.
and imbued them,
This is purely a pedantic observation. As the sentence is constructed, the 'manufacture imbued the biface with cultural meaning.' I think not.  
over time, with cultural meaning within hominin social groups.
Again, the 'meaning' of an object or of its manufacture is something that you and I are quite able to grasp, because we give meaning to everything in our world. Whether or not previous hominid forms were capable of having this conversation, and thus able to give an arbitrary meaning or meanings to a given object is still a very open question.
Here, the longevity,
The persistence through time and space of the 'handaxe' can be explained by hominid cognition or the absence thereof, and is therefore a non-question. 
ubiquity,
The wide distribution of 'handaxes' can only be explained by the ubiquity of the hominids, and not the intrinsic or semiotic attributes of the things themselves. 
durability,
The preservation of handaxes cannot be explained by anything other than that they are made of extremely durable material. It's a crying shame that all of the wood handaxes didn't preserve. But that's the archaeological record for you! 
and stability in design
There is nothing stable about the 'design' of a handaxe other than the stability of the paradigm with which archaeologists have viewed them through time and in distinguishing the 'handaxe' from other bifacially flaked pieces that do not conform to the mental template in the mind of those archaeologists. 
of Acheulean handaxes is explained by viewing handaxe construction in three temporal phases, co-evolving with the human niche:
More pedantry, I'm afraid. I fail to see how 'handaxe construction' could 'co-evolve' with the 'human niche.' According to Lam the construction and shape both remained static over immense time and space, and in fact did not 'evolve.' 
first, as iconic
To label the 'handaxe' 'iconic' presumes that their hominid creators had the cognitive ability to 'see' such things as other than lumps of hard stuff. 
multipurpose
The function of the handaxe has always been and continues to be the subject of endless speculation, and it is by no means a certainty that it had any purpose beyond that of a source for sharp chips of stone with which to cut or scrape. 
functional tools,
I believe this is a straightforward redundancy. 
fashioned by ancestral hominins
Finally a well-warranted assumption! 
; second, as standard
Ditto on the whole 'standard' thing. 
indexical commodities
Again, this presumes the hominids that left the 'handaxes' had the ability to conceive of such things, which is by no means a given, especially amongst what Lam goes on to label as 'pre-linguistic' hominids. 
exchanged in social relationships, perhaps as a paleocurrency among pre-linguistic hominins; and third, as symbolic of cultural power, carried and exchanged as gifts by modern humans within socially constructed niches, now filled with shared meanings and language.
I hope that Lam is describing the circumstances that would surround, say, one of my friends getting a hold of a real archaeological 'handaxe,' knowing that I'm a palaeoanthropologist and archaeologist, and giving it to me as a gift, whereupon I would display it on my mantle. Ascribing such behaviour to any 'hominid' other than cognitively modern humans is a huge leap of faith. In fact, the entire abstract, and no doubt the entire paper, is no more than a giant leap of faith. In no way does it represent even the current level of scholarship in the matters about which it claims to have something to say.