Ethiopian News, Current Affairs and Opinion Forum
Posts: 3048
Joined: 15 Feb 2013, 18:28

Haunt Jemima

Post by my2cents » 07 Feb 2015, 05:35


FEBRUARY 5, 2015
Who Will Choose Nigeria’s Next President?

Sixteen years after the end of military rule, many Nigerians are looking forward to the February 14th elections, which will extend their country’s longest stretch of uninterrupted democracy. But the prospect of violence tinges the anticipation, and for some people both candidates are deeply worrisome. “This is the most confusing juncture in Nigeria’s history,” my friend Ojiugo said.

Do we cast our ballots for an incumbent whose five years at the top have been characterized by the inability to combat Boko Haram, the bungling of the Chibok girls’ rescue, and a failure to take action against people and corporations accused of widespread corruption? Or do we vote for a former military general whose twenty-month rule was infamous for a clampdown on freedom of speech and other gross human-rights violations?

President Goodluck Jonathan, of the People’s Democratic Party, and General Muhammadu Buhari, of the All Progressives Congress, are the main contenders. Jonathan, who rose to power after the death of President Umaru Yar’adua, in 2010, is a career politician from the conservative P.D.P., and the first President to come from a minority tribe. He defeated Buhari in the 2011 election, which was described as the country’s freest and fairest since its return to democracy in 1999. Back in 1983, Buhari brought Nigeria’s then fledgling democracy to an abrupt end via a military coup. Less than two years later, his government was toppled in another coup. A survey conducted by Nigeria’s ANAP Foundation, a non-profit organization that promotes good governance, shows that the two men are currently running neck and neck.

“This election will be decided by swing voters,” Atedo Peterside, the president and founder of ANAP Foundation, wrote in a January ThisDay article. Many of these voters are upset, he noted. “They feel that our two major political parties have ‘cheated’ them by forcing them to choose between the devil and the deep blue sea.”

But there are many other Nigerians who know exactly for whom they will vote. Another survey, conducted by Afrobarometer, a pan-African non-partisan research network, shows that forty-two per cent of Nigerians say that they will definitely vote for Jonathan, while another forty-two per cent say that they are firmly for Buhari. Their choice of candidate probably has little to do with track records or antecedents.

There are those who will vote for Buhari simply because he is not Jonathan. For them, change is the word of the hour, and they would happily exchange Jonathan for a Halloween pumpkin if that were the only alternative available. “Anybody but Jonathan” has become a popular refrain in conversations about the election.

There are those who will vote for Jonathan because they have benefitted from his government. Their very subsistence depends on his remaining in power. I have yet to meet a single person with close relatives or other strong connections at the top who thinks that Nigeria needs anything other than a Jonathan Presidency. “He’s doing a lot of good work,” they insist, as the national wealth trickles in their direction. They are not particularly concerned with how everyone else is faring and, as they might say, they have no guarantee that a different President would make things better for everyone. “Goodluck Nigeria!” some of their advertorials in the daily newspapers read.

There are those who will vote for Buhari because they consider him more capable of handling Boko Haram. Not only has Jonathan demonstrated a frightening ineptitude at containing the terrorist group over the past four years but Buhari has a military background. “Buhari will personally lead the fight against Boko Haram,” Yemi Osinbajo, a law professor and Buhari’s running mate, said recently. The retired general led Nigerian troops against seceding Biafra during the civil war of 1967 to 1970, and against the Maitatsine religious insurgents in parts of northern Nigeria in the early nineteen-eighties. Buhari himself is from the north, where every domestic Boko Haram attack has taken place, and is unlikely to be apathetic about the fate of the region. During a media event in January, 2014, President Jonathan boasted that his government had pushed Boko Haram to the “fringes” of Nigeria. “Are the people of the northeast prepared to accept that they have been abandoned by their leaders, and therefore need to wake up and take their destiny in their hands?” Garba Shehu, a political analyst, wrote in a November, 2014, Premium Times column.

There are those who will vote for Jonathan precisely because of his struggle to manage the Boko Haram crisis. They believe the conspiracy theory, popular in certain ethnic circles, that the insecurity in Nigeria is orchestrated by forces determined to make the country ungovernable for its first minority President. By voting for Jonathan, they hope to show the unknown forces that their evil machinations have come to naught. “Boko Haram is the handiwork of those who promised to make the country ungovernable if Jonathan dared to compete in the election,” they say, referring to the displeasure expressed by certain northern politicians in 2012, when Jonathan broke a gentlemen’s agreement not to run for a second term and to hand over party leadership to the north.

There are those who will vote based on clan sentiments. In the three previous Presidential elections, all of which Buhari lost, he gained the greatest number of votes in his native northwest region. Crowds of fervent supporters in that part of the country have jammed stadiums and spilled into nearby streets at his rallies. The same goes for Jonathan in the oil-producing Niger Delta region. Many there would rather see their “brother” at the top than anyone else. After all, the crude oil that lubricates the national treasury “belongs” to them. “If they take power, we will demand for all the years of their benefitting from oil,” Victor Ben, an ex-militant, said during a meeting of Niger Delta youth on January 25th.

These clan loyalists are often willing not only to cast ballots but also to shed blood. “You don’t have any valid visa on your passport?” some of my loved ones have exclaimed, concerned that I won’t be able to flee to the West if violence erupts after February 14th. About eight hundred Nigerians lost their lives in the wake of the 2011 elections, when riots broke out in the north following Buhari’s loss to Jonathan. Last week, at the Niger Delta youth meeting, some ex-militants declared that they would “go to war” if Jonathan’s reëlection bid fails. “Every Niger Delta youth should go and prepare!” Asari Dokubo, an ex-militant, said. A friend who is travelling to the United States to give birth to her first child has decided to take an earlier leave from work and depart Nigeria two days before the elections. “In case they close all the airports afterwards,” she said.

Last edited by my2cents on 07 Feb 2015, 06:20, edited 1 time in total.


Re: Haunt Jemima

Post by my2cents » 07 Feb 2015, 05:45


The New Yorker | Letter from Sudan

The Moderate Martyr
A radically peaceful vision of Islam.


In 1967, a law student at the University of Khartoum named Abdullahi Ahmed an-Naim was looking for a way to spend a summer evening in his home town, a railway junction on the banks of the Nile in northern Sudan. No good movies were showing at the local cinemas, so he went with a friend to hear a public lecture by Mahmoud Muhammad Taha, an unorthodox Sudanese mystic with a small but ardent following. Taha’s subject, “An Islamic Constitution: Yes and No,” tantalized Naim. In the years after Sudan became independent, in 1956, the role of Islam in the state was fiercely debated by traditional Sufists, secular Marxists, and the increasingly powerful Islamists of the Muslim Brotherhood, who, at the time, were led in Sudan by Hasan al-Turabi, a legal scholar. Politically, Naim was drifting toward the left, but his upbringing in a conservative Muslim home had formed him. “I was very torn,” Naim recently recalled. “I am a Muslim, but I couldn’t accept Sharia”—Islamic law. “I studied Sharia and I knew what it said. I couldn’t see how Sudan could be viable without women being full citizens and without non-Muslims being full citizens. I’m a Muslim, but I couldn’t live with this view of Islam.”

Naim’s quandary over Islam was an intensely personal conflict—he called it a “deadlock.” What he heard at Taha’s lecture resolved it. Taha said that the Sudanese constitution needed to be reformed, in order to reconcile “the individual’s need for absolute freedom with the community’s need for total social justice.” This political ideal, he argued, could be best achieved not through Marxism or liberalism but through Islam—that is, Islam in its original, uncorrupted form, in which women and people of other faiths were accorded equal status. As Naim listened, a profound sense of peace washed over him; he joined Taha’s movement, which came to be known as the Republican Brothers, and the night that had begun so idly changed his life.

It is a revelation story, and some version of it is surprisingly easy to hear in the Islamic world, especially among educated middle-class Muslims in the generation that came after the failures of nationalism and Socialism. During a recent trip to Sudan, I visited the University of Khartoum, which is housed in a collection of mostly colonial-era, earth-colored brick buildings in the city center, where I met a woman named Suhair Osman, who was doing graduate work in statistics. In 1993, at the age of eighteen, she spent the year between high school and college in her parents’ house on the Blue Nile, south of Khartoum, asking herself theological questions. As a schoolgirl, she had been taught that sinners would be eternally tormented after death; she couldn’t help feeling sorry for them, but she didn’t dare speak about it in class. Would all of creation simply end either in fire or in Paradise? Was her worth as a woman really no more than a quarter that of a man, as she felt Islamic law implied by granting men the right to take four wives? Did believers really have a duty to kill infidels? One day, Osman took a book by Taha off her father’s shelf, “The Koran, Mustapha Mahmoud, and Modern Understanding,” published in 1970. By the time she finished it, she was weeping. For the first time, she felt that religion had accorded her fully equal status. “Inside this thinking, I’m a human being,” she said. “Outside this thinking, I’m not.” It was as if she had been asleep all her life and had suddenly woken up: the air, the taste of water, food, even the smell of things changed. She felt as if she were walking a little off the ground.

The quest for spiritual meaning is typically a personal matter in the West. In the Islamic world, it often leads the seeker into some kind of collective action, informed by utopian aspiration, that admits no distinction between proselytizing, social reform, and politics. The Islamic revival of the past several decades is the history of millions of revelation stories. Far from being idiosyncratic or marginal, they have combined into a tremendous surge that is now a full-time concern of the West. Renewal and reform—in Arabic, tajdid and islah—have an ambiguous and contested meaning in the Islamic world. They signify a stripping away of accumulated misreadings and wrong or lapsed practices, as in the Protestant Reformation, and a return to the founding texts of the Koran and the Sunna—guidelines based on the recorded words and deeds of the Prophet. But, beyond that, what is the nature of the reform? The father of one modern revelation story is Sayyid Qutb, the Egyptian religious thinker who, after advocating jihad and the overthrow of secular Arab regimes, was hanged by Gamal Abdel Nasser in 1966. Qutb’s prison writings reject modernity, with its unholy secularism, and call on adherents of Islam to return to a radically purified version of the religion, which was established in the seventh century. Among the idealistic young believers who found in his books a guide to worldwide Islamization were Ayman al-Zawahiri and Osama bin Laden. With the newest generation of jihadis—Qutb’s spiritual grandchildren—the ideas of the master have been construed as a justification for killing just about anyone in the name of reviving the days of the Prophet; earlier this year, several Baghdad falafel venders were killed by Islamists because falafel did not exist in the seventh century.

Mahmoud Muhammad Taha is the anti-Qutb. Taha, like Qutb, was hanged by an Arab dictatorship; he was executed, in 1985, for sedition and apostasy, after protesting the imposition of Sharia in Sudan by President Jaafar al-Nimeiri. In death, Taha became something rare in contemporary Islam: a moderate martyr. His method of reconciling Muslim belief with twentieth-century values was, in its way, every bit as revolutionary as the contrary vision of Qutb. It is one sign of the current state of the struggle over Islam that, in the five years since September 11th, millions of people around the world have learned the name Sayyid Qutb while Mahmoud Muhammad Taha’s is virtually unknown. Islamism has taken on the frightening and faceless aspect of the masked jihadi, the full-length veil, the religious militia, the blurred figure in a security video, the messianic head of state, the anti-American mob. At Islam’s core, in the countries of the Middle East from Egypt to Iran, tajdid and islah have helped push societies toward extremes of fervor, repression, and violence. But on the periphery, from Senegal to Indonesia—where the vast majority of Muslims live—Islamic reform comes in more varieties than most Westerners imagine. At the edges, the influence of American policy and the Israeli-Palestinian siege is less overwhelming, and it is easier to see that the real drama in Islam is the essential dilemma addressed by Taha: how to revive ancient sacred texts in a way that allows one to live in the modern world.

Taha was born sometime early in the twentieth century—scholars say 1909 or 1911—in a town on the eastern bank of the Blue Nile, two hours south of Khartoum, called Rufaa. It is a somnolent, heat-drenched town, one of those silent places—they stretch from one harsh end to the other of the North African region known as the Sahel—where mystical movements often begin. In the years before Sudan’s independence, Taha was educated as a civil engineer in a British-run university, and after working briefly for Sudan Railways he started his own engineering business. He absorbed modern political and social ideas by reading widely, if incompletely, in the works of Marx, Lenin, Russell, Shaw, and Wells. In 1945, he founded an anti-monarchical political group, the Republican Party, and was twice imprisoned by the British authorities: first for writing pro-independence pamphlets, and then for leading an uprising in Rufaa against the arrest of a local woman who had subjected her daughter to a particularly severe form of female circumcision. (Taha opposed the practice but believed that the colonial edict banning it would only make it more widespread.) His second imprisonment lasted two years, and when he was released, in 1948, he entered a period of seclusion, prayer, and fasting in a small mud building in the courtyard next to his in-laws’ house. By the time I visited Rufaa, in July, the hut had been torn down and replaced, and the house was occupied by a family of southern Sudanese.

While in seclusion, Taha spoke to few people; one man described him as having long, unruly hair and bloodshot eyes. His wife brought him plates of simple food—her family urged her to divorce this formerly successful professional, who some people thought had gone mad, but she refused—and he left the hut only to take swims in the Nile, a short walk away. During this period, which lasted three years, Taha developed his radically new vision of the meaning of the Koran. After emerging from seclusion, in 1951, he dedicated the rest of his life to teaching it.

For any Muslim who believes in universal human rights, tolerance, equality, freedom, and democracy, the Koran presents an apparently insoluble problem. Some of its verses carry commands that violate a modern person’s sense of morality. The Koran accepts slavery. The Koran appoints men to be “the protectors and maintainers of women,” to whom women owe obedience; if disobeyed, men have the duty first to warn them, then to deny them sex, and finally to “beat them (lightly).” The Koran orders believers to wait until the holy months are finished, and then to “fight and slay the Pagans wherever you find them, and seize them, beleaguer them, and lie in wait for them in every stratagem (of war).” These and other verses present God’s purpose in clear, unmistakable terms, and they have become some of the favorite passages in the sermons, fatwas, and Internet postings of present-day fundamentalists to justify violence and jihad. An enormous industry of reform-minded interpreters has arisen in recent years to explain them away, contextualize them, downplay them, or simply ignore them, often quoting the well-known verse that says there is “no compulsion in religion.” Not long ago, I received one such lecture from a Shiite cleric in Baghdad, who cited the “no compulsion” verse while sitting under a portrait of Ayatollah Khomeini. In confronting the troublesome verses head on, Taha showed more intellectual honesty than all the Islamic scholars, community leaders, and world statesmen who think that they have solved the problem by flatly declaring Islam to be a religion of peace.

The Koran was revealed to Muhammad in two phases—first in Mecca, where for thirteen years he and his followers were a besieged minority, and then in Medina, where the Prophet established Islamic rule in a city filled with Jews and pagans. The Meccan verses are addressed, through Muhammad, to humanity in general, and are suffused with a spirit of freedom and equality; according to Taha, they present Islam in its perfect form, as the Prophet lived it, through exhortation rather than threat. In Taha’s most important book, a slender volume called “The Second Message of Islam” (published in 1967, with the dedication “To humanity!”), he writes that the lives of the “early Muslims” in Mecca “were the supreme expression of their religion and consisted of sincere worship, kindness, and peaceful coexistence with all other people.” Abdullahi an-Naim, who is now a law professor at Emory University, translated the book into English; in his introduction, he writes, “Islam, being the final and universal religion according to Muslim belief, was offered first in tolerant and egalitarian terms in Mecca, where the Prophet preached equality and individual responsibility between all men and women without distinction on grounds of race, sex, or social origin. As that message was rejected in practice, and the Prophet and his few followers were persecuted and forced to migrate to Medina, some aspects of the message changed.”

As Taha puts it in “The Second Message of Islam,” whereas Muhammad propagated “verses of peaceful persuasion” during his Meccan period, in Medina “the verses of compulsion by the sword prevailed.” The Medinan verses are full of rules, coercion, and threats, including the orders for jihad, and in Taha’s view they were a historical adaptation to the reality of life in a seventh-century Islamic city-state, in which “there was no law except the sword.” At one point, Taha writes that two modest decrees of the Meccan verses—“You are only a reminder, you have no dominion over them”—were appended with a harsh Medinan edict: “Except he who shuns and disbelieves, on whom God shall inflict the greatest suffering.” In his distinctive rhetorical style, which combines dense exegesis with humanistic uplift, Taha observed, “It is as if God had said, ‘We have granted you, Muhammad, dominion over anyone who shuns and disbelieves, so that God shall subject him to minor suffering at your hands through fighting, then God shall also subject him to the greatest suffering in hell.’ . . . Thus the first two verses were abrogated or repealed by the two second verses.”

The Medinan verses, directed not to Muhammad alone but to the community of early believers, became the basis for Sharia as it was developed by legal scholars over the next few centuries—what Taha calls the “first message of Islam.” In Taha’s revisionist reading, the elevation of the Medinan verses was only a historical postponement—the Meccan verses, representing the ideal religion, would be revived when humanity had reached a stage of development capable of accepting them, ushering in a renewed Islam based on freedom and equality. Taha quoted a Hadith, or saying of the Prophet, that declared, “Islam started as a stranger, and it shall return as a stranger in the same way it started.” This “second message of Islam” is higher and better than the first, delivered by a messenger who came to seventh-century Arabia, in a sense, from the future. And, in the twentieth century, the time had come for Muslims finally to receive it. Taha offered a hermeneutical way out of the modern crisis of Islam, allowing Muslims to affirm their faith without having to live by an inhumane code.

Taha’s reputation and importance far exceeded his actual following, which never amounted to more than a few thousand intensely devoted Sudanese: the stories of overwhelming personal transformation that I heard from Naim, Osman, and other Republican Brothers were apparently common among his adherents. (Taha adapted the name of his old political party for his new spiritual movement; he was wary of substituting Islamist slogans for critical thinking.) He received visitors at his house in Omdurman, northwest of Khartoum, at all hours, engaging in a kind of continuous seminar in which he was unmistakably the instructor—Republican Brothers still call him Ustazh, or “revered teacher”—but one who welcomed argument. “He would listen with utmost respect,” a follower named Omer el-Garrai told me. “I never saw him frustrated, I never saw him angry, I never heard him shout.” Naim recalled, “Taha could not transmit his religious enlightenment to us by talking about it. We would see the fruit of it by his personal life style, in his attitudes. His honesty, his intellectual vigor, his serenity, his charisma—those are the things that we can observe, and from them I understood that this is someone who had a transformative religious experience.” Taha lived simply, urging his followers to do the same, and even today Republican Brothers are known for their lack of show in dress and in wedding ceremonies. An aura of saintliness hangs over stories I heard about Taha in Sudan, and, as with Gandhi, to whom he is sometimes compared, there’s an unappealingly remote quality to his moral example. A man named Anour Hassan recalled that when Taha’s twelve-year-old son vanished in the Blue Nile, in 1954, Taha calmly told people who wanted to continue looking for the boy, “No, he’s gone to a kinder father than I am.”

Perhaps the twentieth century was too soon for the second message of Islam. Taha was condemned for apostasy by Sudanese and Egyptian clerics, his movement was under constant attack from the fundamentalist Muslim Brotherhood, and his public appearances were banned by the government. Various rumors began to circulate: that Taha and his followers believed him to be a new prophet, or even a divinity; that Taha didn’t pray; that he was insane. His legacy became controversial even among liberal-minded Sudanese. One evening in July, I spoke with the moderate politician and intellectual Sadiq al-Mahdi on the terrace overlooking the garden of his palatial home in Omdurman. Mahdi, who twice served as Prime Minister of Sudan and was twice ousted, in 1967 and 1989, is an imposing man: he was wearing the traditional white djellabah and turban, and his beard was hennaed. He spoke respectfully of Taha but found him theologically unsound. “Amongst the Islamists, there are those who jump into the future and those who jump into the past,” Mahdi said, comparing Taha with Qutb. “Taha is amongst those who jump into the future. He definitely is for radical Islamic reform. But he based it on arguments that are not legitimate.” Mahdi, like many other modern Muslim thinkers, believes that the Koran already offers the basis for affirming democratic values; there is no need, as he put it, to perform “these somersaults.”

What’s truly remarkable about Taha is that he existed at all. In the midst of a gathering storm of Islamist extremism, he articulated a message of liberal reform that was rigorous, coherent, and courageous. His vision asked Muslims to abandon fourteen hundred years of accepted dogma in favor of a radical and demanding new methodology that would set them free from the burdens of traditional jurisprudence. Islamic law, with its harsh punishments and its repression of free thought, was, Taha argued, a human interpretation of the Medinan verses and the recorded words and deeds of the Prophet in Medina; developed in the early centuries after Muhammad, it was then closed off to critical revision for a millennium. When Taha spoke of “Sharia,” he meant the enlightened message of the Meccan verses, which is universal and eternal. To Muslims like Mahdi, this vision seemed to declare that part of the holy book was a mistake. Taha’s message requires of Muslims such an intellectual leap that those who actually made it—as opposed to those who merely admired Taha or were interested in him—took on the quality of cult members, with their white garments, street-corner sermons, and egalitarian marriage contracts. Small wonder that Taha failed to create a durable mass movement. In “Quest for Divinity,” a new and generally sympathetic study of Taha, to be published this fall, Professor Mohamed A. Mahmoud, of Tufts University, writes, “The outcome of this culture of guardianship and total intellectual dependency was a movement with impoverished inner intellectual and spiritual resources, intrinsically incapable of surviving Taha’s death.”

Why did the Sudanese state, the religious establishment, and the Islamist hard-liners consider the leader of such a small movement worth killing? Perhaps because, as Khalid el-Haj, a retired school administrator in Rufaa, who first met Taha in the early sixties, told me, “They are afraid of the ideas, not the numbers. They know that the ideas are from inside Islam and they cannot face it.”

Eventually, Taha’s teaching collided with Islamist power politics. Sudan’s military dictator, Jaafar al-Nimeiri, who had seized control of the country in 1969, was an opportunistic tyrant who had exhausted one model after another to justify his rule: Marxism, Arab nationalism, pro-Americanism. By the early eighties, Nimeiri’s hold on power was loosening, and he felt particularly threatened by one of his advisers: Hasan al-Turabi, the legal scholar, who had an increasingly energetic Islamist following. Turabi, a brilliant politician with a British and French education, was an authoritarian ideologue, more in the mold of a Bolshevik than a hidebound cleric. One of Turabi’s prime intellectual enemies was Taha, whose interpretation of the Koran he considered illegitimate. Taha, for his part, once dismissed Turabi as “clever but not insightful”—and many Sudanese believe that Turabi never forgot the slight.

In 1983, Nimeiri, aiming to counter Turabi’s growing popularity, decided to make his own Islamic claim. He hastily pushed through laws that imposed a severe version of Sharia on Sudan, including its Christian and animist south. Within eighteen months, more than fifty suspected thieves had their hands chopped off. A Coptic Christian was hanged for possessing foreign currency; poor women were flogged for selling local beer. It was exactly the kind of brutal, divisive, politically motivated Sharia that Taha had long warned against, and southerners intensified a decades-long civil war against Khartoum. Taha and other Republican Brothers, including Naim, had been jailed in advance by Nimeiri to prevent them from leading protests; their imprisonment lasted a year and a half.

Soon after Taha was released, he distributed a leaflet, on Christmas Day, 1984, titled “Either This or the Flood.” “It is futile for anyone to claim that a Christian person is not adversely affected by the implementation of sharia,” he wrote. “It is not enough for a citizen today merely to enjoy freedom of worship. He is entitled to the full rights of a citizen in total equality with all other citizens. The rights of southern citizens in their country are not provided for in sharia but rather in Islam at the level of fundamental Koranic revelation.”

Taha, who was now in his mid-seventies, had been preparing half his life for this moment. It was central to his vision that Islamic law in its historical form, rather than in what he considered its original, authentic meaning, would be a monstrous injustice in modern society. His opposition was brave and absolute, and yet his statement reveals the limits of a philosophy that he hoped to make universal. Taha opposed secularism—he once declared that the secular West “is not a civilization because its values are confused”—and he could not conceive of rights outside the framework of Islam and the Koran. At the very moment that he was defending non-believers from the second-class status enshrined in Islamic law, he was extending their equal rights through a higher, better Sharia.

Abdullahi an-Naim defends Taha’s approach, saying that in the Islamic world a Turkish-style secularism will always be self-defeating. “It is an illusion to think you can sustain constitutionalism, democratization, without addressing its Islamic foundation,” he said. “Because for Muslims you cannot say, ‘I’m a Muslim, but—’ That ‘but’ does not work. What unites Muslims is an idea. It is Islam as an idea. And therefore contesting that idea, I think, is going to be permanent.” Whenever secular intellectuals in Muslim countries try to bypass the question of Sharia, Naim said, “they leave the high moral ground to the fundamentalists, and they lose.” Invoking Islam as the highest authority for universal rights was not simply a matter of belief; it meant that Taha and his movement could stay in the game.

Soon after Taha’s Christmas statement was released, he was arrested again. This time, the government pressed charges amounting to apostasy, which carried the death penalty. Taha refused to recognize the legitimacy of the court under Sharia, refused to repent, and in a matter of hours was condemned to death. The hanging was scheduled for the morning of January 18, 1985. Among the hundreds of spectators in the vast courtyard of Kober Prison, in Khartoum North, was Judith Miller, then a Times reporter, disguised in a white cloak and head scarf. In the opening of her 1996 book, “God Has Ninety-nine Names,” Miller described the scene:

Shortly before the appointed time, Mahmoud Muhammad Taha was led into the courtyard. The condemned man, his hands tied behind him, was smaller than I expected him to be, and from where I sat, as his guards hustled him along, he looked younger than his seventy-six years. He held his head high and stared silently into the crowd. When they saw him, many in the crowd leaped to their feet, jeering and shaking their fists at him. A few waved their Korans in the air.
I managed to catch only a glimpse of Taha’s face before the executioner placed an oatmeal-colored sack over his head and body, but I shall never forget his expression: His eyes were defiant; his mouth firm. He showed no hint of fear.

In the instant that the trapdoor opened and Taha’s body fell through, the crowd began to scream, “Allahu Akbar! Allahu Akbar! Islam huwa al-hall! ”—“God is great! Islam is the solution!”—the slogan of the Muslim Brotherhood.

Some of Taha’s followers could not accept that he was dead—they had actually come to believe in Taha’s divinity—and they spent several days by one of the bridges spanning the Nile, waiting for him to appear. When he didn’t (his body was flown by helicopter to an unknown location in the desert for a hasty burial), the Republican Brotherhood essentially died. Some members, including Naim, went abroad; others stayed in Sudan but ceased any public activity. The regime forced a number of imprisoned Republican Brothers to repudiate Taha’s ideas in order to avoid his fate. His books were burned in public bonfires.

The execution appalled large numbers of Sudanese, who were unused to political violence, and it helped precipitate the downfall of Nimeiri, four months later, when a popular uprising restored democratic rule. January 18th became Arab Human Rights Day. In 2000, a Sudanese reporter asked Nimeiri about the death of Taha. Nimeiri expressed regret over the killing, then made a startling claim: Taha’s execution had been secretly engineered by Hasan al-Turabi.

“I didn’t want him killed,” Nimeiri said of Taha. “Turabi told me that Mahmoud Muhammad Taha wanted to side with the left against me and that the Republican Brothers are a force not to be underestimated, and that if he united with the left I am definitely doomed. Turabi brought me the order to execute him and asked me to sign off on it. . . . I decided to postpone my decision for two days, and on the third day I went to Taha, dressed in civilian clothes. I told him, ‘Your death would sadden me. Just back down on your decision.’ But he spoke to me in a way that at the time I felt was blustering but now I see it was honorable, considering the situation. He told me, ‘You back down on your decision. As for me, I know that I’m going to be killed. If I’m not killed in court, the Muslim Brotherhood will kill me in secret. So leave and let me be. I know that I am going to die.’ ”

I asked a number of people in Khartoum about the role that Turabi might have played in Taha’s death. “Turabi killed him” was the blunt verdict of Hyder Ibrahim Ali, a sociologist and the director of the Sudanese Studies Center. “I think Turabi was behind all this. Taha was a real rival for Turabi. At that time, the only people at the University of Khartoum as strong as the Muslim Brotherhood were the Republican Brothers.” Others echoed this view: even if Turabi hadn’t played a direct role in Taha’s death, Taha’s reform-minded movement had offered the most serious theological challenge to Turabi’s severe Islamism.

In the decade after Taha’s death, Turabi and his hard-line politics flourished. In 1989, he was the prime strategist of the Islamist revolution that followed the military overthrow of Prime Minister Sadiq al-Mahdi. He became the intellectual architect of the new regime, led by Omar al-Bashir, and presided over its reign of terror in the nineties. He was the impresario who attracted just about every leading jihadi terrorist to Sudan; journalists started calling him “the Khomeini of the Sunnis” and “the pope of terrorism.” In 1999, however, Turabi’s fortunes abruptly changed: he lost a power struggle with Bashir, who fired him.

This spring, Turabi, in a striking return to Sudanese politics, said some astonishing things about Islam. Though he had always been more supportive of women’s rights than other hard-liners, he was now declaring that women and men are equal, that women can lead Islamic prayers, that covering the hair is not obligatory, that apostasy should not be a crime. He said that Muslim women can marry Christians or Jews. Quotations in the Arab press made him sound like a liberal reformer. In Khartoum, people marvelled that he sounded exactly like Taha. Suhair Osman, the young woman I met at the University of Khartoum, informed me, with a wan smile, “It is said in the daily papers and in the discussion centers here in the university that Turabi killed Ustazh Mahmoud and now he’s stealing his ideas.”

In the next few decades, several Arab countries—Iraq, Palestine, perhaps Egypt and Algeria—may well come under some form of Islamist rule, either by election or by force. If so, they would do well to study the example of Sudan. A whole generation in Sudan has grown up under the hard-line ideology that was imposed by Turabi and his colleagues after 1989. “We are the wounded surgeons, we have had the plague,” Sadiq al-Mahdi told me. “We have been the guinea pig of this whole exercise, and you should listen to us.”

Islam is as diverse as Muslims themselves, but Islamism, thus far in its short history, tends to look the same wherever it appears. The Sudanese version was not a genuine revolution like the Iranian one; it was more of an élite project that never gained legitimacy outside of student, intellectual, and military circles. Still, Sudan’s hard-line party, the National Islamic Front, marched the country through familiar paces. Suliman Baldo, the director of the Africa program at the International Crisis Group, who lived through the years of Islamization in Khartoum and published a report documenting the return of slavery in Sudan, said of the government, “They came with a social-engineering project—they were very open about this.” Education became a form of indoctrination: small children learned jihadist chants; school uniforms were replaced with combat fatigues; students engaged in paramilitary drills and memorized the Koran; teachers overhauled the curriculum to focus on the glory of Arab and Islamic culture. Khartoum had been a socially relaxed city that celebrated Christmas, but now the morals police insured that women were veiled, especially in government offices and universities. The security agencies were taken over by Islamists, and torture chambers known as “ghost houses” proliferated in what had been a tolerant political culture. (Some torturers were reportedly trained by Iranian Revolutionary Guards.) Young men were conscripted into the new People’s Defense Force and sent to fight in the jihad against the infidels of the south, thousands of them crying “Allahu Akbar! ” as they went to their deaths. Turabi declared that the jihadis would ascend directly to Paradise. Actors simulated “weddings” between martyrs and heavenly virgins on state television. Turabi gave asylum and assistance to terrorists, including bin Laden and other Al Qaeda members, and Sudan soon made enemies of every one of its many neighbors, along with the United States. 
And so an ethnically and religiously mixed African country, with an egalitarian brand of Sufism as its dominant form of Islam, was mobilized by intellectuals and soldiers to create a militaristic, ideologically extreme state whose main achievements were civil war, slavery, famine, and mass death.

Sometime in the late nineties, Turabi realized that his grand enterprise was a failure. Sudan had come under United Nations sanctions for sponsoring a 1995 assassination attempt on President Hosni Mubarak, of Egypt. The country was internationally isolated; the civil war was killing millions. And the Islamist project was bankrupt. As in Iran, it had produced an increasingly wealthy and corrupt ruling class of ideologues and security officers, while young Sudanese, including many of Turabi’s followers, left the country or turned inward.

It was at this low point that Omar al-Bashir expelled Turabi from the government. Until last year, Turabi found himself in and out of jail, and he began to rethink his politics. He declared that the war in the south had not been a jihad after all but, rather, a meaningless waste. In prison, he began to write about where the Islamists had gone wrong. The problem, he decided, was a failure to adhere to principles of democracy and human rights. This spring, Turabi began attracting attention with his liberal statements about women and Islam. He welcomed the deployment of a United Nations force to the Darfur region, where the government had launched a campaign of ethnic cleansing, and he mocked bin Laden for threatening to mount a jihad against the peacekeepers. (Some analysts believe that Turabi had a hand in the rebellion that preceded the mass killings in the region, but no one has been able to prove it.) His remarks were so radical that they earned him charges of apostasy by clerics in Sudan and Saudi Arabia. The Saudi edition of the Sudanese newspaper that quoted his proclamations had the offending lines torn out of every copy.

In Khartoum, people used the same phrase over and over: there had been “a hundred-and-eighty-degree turn” in Turabi’s views. I heard several explanations. Sadiq al-Mahdi, the former Prime Minister, believed that Turabi was trying to atone for the damage he had inflicted on Sudan. Others saw old opportunism under new slogans: Turabi realized that, thanks to Islamist misrule, democracy would be the next wave in Sudan, and he wanted to get out in front of it. There was also the possibility that he couldn’t bear to be ignored.

One day in late July, during a hard Sahara windstorm that obscured the merciless sun and left sand in my molars, Turabi received me in his office on the outskirts of Khartoum, beyond the airport. I found him sitting behind a vast desk, which was almost bare; so were the bookcases next to it, as if he were waiting for someone to refurnish the trappings of power. Turabi is now seventy-four years old. He has a trim white beard and bright eyes framed by elegant wire-rim glasses; he wore a white djellabah and turban, white patent-leather loafers, and flower-patterned polyester socks. He has a resonant voice, which, when the topic turns serious, often breaks into a disconcerting giggle, accompanied by a bucktoothed grin. Turabi is inexhaustible: before I arrived, he had spoken for three days to members of his breakaway political party, but he required scarcely any prompting to carry on a nearly three-hour monologue with me. It was like trying to follow the flight path of a mosquito: he would leave sentences unfinished, switch subjects in the span of a clause, swallow the point in a laugh, then suddenly alight somewhere—on hidebound Saudi clerics, clueless Algerian Islamists, pigheaded Sudanese soldiers, shortsighted American politicians—and draw blood.

Turabi presented himself as older but wiser, free now to be the one independent thinker in a world of ideologues, an emissary for both sides in the war between Islam and the West, unafraid of uttering any truth, including truths about his own mistakes—but whenever I tried to pin him down on one he blamed someone else and claimed that his role was exaggerated. “Oh, Turabi, he’s the ‘pope of terrorism,’ of fundamentalism, the pope noir du terrorisme! ” he mocked. The Bush Administration’s war on terror, he said, was a gigantic misunderstanding based on a failure to communicate. As for the Islamic revival, it held no dangers for the West. “Oh, no, it’s positive!” he said. “What is our economic model? It’s not the Soviet model. It’s not the old capitalist model, or the feudal model. It’s your model! What is our political model? It’s your model! Almost the same model! O.K.?”

Toward the end of his discourse, I mentioned that a number of Sudanese had heard echoes of Mahmoud Muhammad Taha in his recent statements. For the first time, Turabi lost his good humor. “Ooh,” he groaned. He called Taha “an apostate” who was “not normal,” and he insisted that, far from being behind Taha’s death, he had argued with Nimeiri for his life: “I said, ‘Why do you jail this man? He won’t hurt you, he’s not against this regime. He thinks he’s the impersonation of Jesus Christ!’ ” Turabi laughed dismissively. “I said, ‘Let him go and advocate his message. He will persuade a few people for some time. He’s not harmful to you.’ ” He said of Taha, “From early days, I don’t read his books, I don’t mention his name. Even if people ask me questions, I try to evade, because in every society, in America, you have had these cult people—everyone has to drink the killing material! Jim Jones!”

Turabi giggled and stood up to say goodbye.

When I had asked Abdullahi an-Naim about Turabi’s recent statements on women, minorities, and Islam, he had scoffed, “He has no methodology.” It was true: Turabi threw out opinions like handfuls of seed. But, as Taha had said, the one constant in his long career has been cleverness. Turabi seemed to recognize that, in the ruins of his own making in Sudan, his countrymen required a new notion of Islam and government. Great turns in history seldom come because someone writes a manifesto or proposes a theory. Instead, concrete experience, usually in the form of catastrophic failure, forces people to search for new ideas, many of which have been lying around for quite a while. Naim, who had fled the country after the 1989 coup, went back to Sudan in 2003 to find that “people were totally disillusioned about the Islamist project. They could see that it was corrupting and corrupt.” In reaction, a small but growing number of Sudanese have come under the influence of Saudi Wahhabism—turning to an even more extreme theology as the pure Islam. Others, such as Osman Mirghani, a newspaper columnist and a former follower of Turabi, have concluded that the problem in Sudan has less to do with religion than with its civic culture. Mirghani has formed a new citizens’ movement, Sudan Forum, waging its first campaign against corruption in high places.

Taha’s solution to the modern Muslim dilemma hovers over the conversations of Sudanese who are too young to have any memory of him. In a dingy office in downtown Khartoum, I met a man named Hussein and a woman named Buthina, two social activists who are just the kind of idealists that the Islamists used to attract. In 1989, as a teen-ager, Hussein had at first welcomed the new government. He soon realized that its promises of Islamic justice were false, and he was traumatized by the year he spent as a conscript in the jihad against the south. “In my view, this regime is a great shame in the history of Islam,” he said. “It’s pushed people away from Islam. Their mentality has changed. They are no longer abiding by Islamic regulations.” He mentioned prostitution, drinking, and corruption. For all Hussein’s disillusionment, he still believed in Sharia—in flogging for fornication, stoning for adultery, and beheading for apostasy—but he wanted it to be applied under a democratic government grounded in human rights. Buthina shook her head; Islamist rule had turned her toward secularism. “This is a very, very sensitive issue,” she said. “When you design your regulations and constitution, you have to accept that all the people look at this constitution and see themselves in it. Otherwise, they will not implement it. If we design the constitution and the law of the country on Islam, this will create a problem.”

When I described Hussein to Naim, he said, “He sees the corruption of the current regime, and he sees the unworkability of an Islamic state, but he has no alternative. That is the point about Taha. Taha provides an alternative. As the crisis intensifies, the receptivity to something like Taha’s ideas will grow.” The misrule of Turabi and the Sudanese Islamists, Naim said, had done more to advance the project of reforming Sharia than Taha’s followers could ever have achieved. At the same time, he admitted that most people in Sudan today have never heard of Taha. All that is left of his movement is a few hundred followers, some of whom gather in the evenings at a house in Omdurman. I was invited to join them there one night: the men sat in chairs on one side of the courtyard, the women on the other, but they mixed more than the religious Muslims at most gatherings. All dressed in white, they chanted traditional Sufi songs and a mournful hymn about their martyred leader.

The hollowness at the core of Sudan, and the widespread cynicism about Islamist rule, with its enforced ideology and rituals, is reminiscent of Eastern Europe in the years before the fall of the Berlin Wall. But if you spend time in an Islamic country you soon realize that the Communism analogy runs dry. For Islam, unlike Marxism, is deeply rooted and still present in everyday life in profound ways. As such, it is an irresistible mobilizing tool for politicians: an Islamist leader in Morocco, Nadia Yassin, once said, “If I go into the streets and I call people to come with me to a demonstration, and I talk to them about Che Guevara and Lenin, nobody will go out. But if I start talking about Muhammad and Ali and Aisha and all the prophets of Islam, they will follow me.” Islam remains the system of values by which Muslims live; it is strong enough to survive Islamism. Perhaps, in time, the religion’s centrality will subside, but, for the foreseeable future, the Islamic enlightenment in which so many Western thinkers have placed their hopes—that is, secularism—will not sweep the Muslim world. The Islamic revival, and its attendant struggles and ills, is less like the eighteenth century in Europe than like the sixteenth, the age of Luther, when the most sensitive and ambitious Englishmen, Frenchmen, and Germans devoted their efforts to finding in the words of the Bible a meaning for which they were prepared to live and die.

On the wall of Naim’s office at Emory University, just above a picture of his parents, there is a black-and-white portrait of Taha in old age, seated, with the folds of a white robe draped over his shoulders and the Sudanese turban wrapped around his head; his gaze is both direct and abstracted, taking in something far beyond the camera. Ever since the night Naim attended Taha’s lecture as a young law student, he has believed that Muslims must find a way out of the predicament in which their own history has placed them—if not by accepting Taha’s vision, then by working toward another.

“I don’t really have high hopes for change in the Arab region, because it is too self-absorbed in its own sense of superiority and victimhood,” he said. His hope lies in the periphery—West Africa, the Sahel, Central and Southeast Asia: “They are not noticed, but that’s where the hope is.” The damage done to Muslim lives under the slogan “Islam is the solution,” and Islamism’s failure to solve their daily problems and answer people’s deepest needs, has forced younger Muslims in countries like Indonesia, Turkey, and Morocco to approach religion and politics in a more sophisticated way. Naim’s newest project, which he calls a work of advocacy more than of scholarship, is a manuscript called “The Future of Sharia.” Even before its English publication, he has begun to post it on the Web, translated into Persian, Urdu, Bengali, Turkish, Arabic, and Bahasa Indonesia. Its theme is more radical than anything he has written before; although it is based on his long devotion to Taha’s ideas, it goes beyond them and, according to some of Taha’s followers, leaves them behind. “The Future of Sharia” amounts to a kind of secularism: it proposes not a rigid separation of politics and religion, as in Turkey, but, rather, a scheme in which Islam informs political life but cannot be introduced into law by an appeal to any religious authority. Otherwise, Muslims would not be free. “I need a secular state to be a Muslim,” Naim said. “If I don’t have the freedom to disbelieve, I cannot believe.”

Two days after we spoke, Naim flew to Nigeria to give a series of lectures, based on the new book, in the northern states that have imposed a particularly harsh form of Sharia. He plans to travel next year to Indonesia and, if possible, to Iran. Two years ago, when he lectured in northern Nigeria, a quarter of his audience of eight hundred people walked out on him, and he had to slip away through a side door. He acknowledged that violence, even murder, might be the response this time. But Naim believes that, despite the evidence of the headlines, Islamic history is moving in his direction.

“In Sudan this simplistic answer failed,” Naim said. “In Iran it failed. In northern Nigeria it failed. In Pakistan it failed. As these experiences fail, people are going to realize that there is no shortcut—that you have to confront the hard questions.” His message to Muslims on his travels will be this: “I have been that way and I’ve seen the street is closed and I came back. And I see someone rushing and I tell him, this street is deadlocked, and he will not take my word and go all the way and discover that it is deadlocked and come back.” He will tell them, “Listen, you don’t have to do this, you don’t have to go down this dead-end street. There is an Arabic expression: ‘The fortunate ones will learn from the mistakes of others, the unfortunate ones will learn from their own mistakes.’ ”

By taking his message to the Muslim public and risking his own life, Naim is, perhaps unconsciously, following the example of one of the intellectual heroes of modern Islam. The first years of the twenty-first century hardly seem hospitable to Mahmoud Muhammad Taha’s humane vision, but his words are there for young Muslims to discover once they get to the end of the street and need a way to turn around. ♦


Re: Haunt Jemima

Post by my2cents » 07 Feb 2015, 10:28

A disgraceful generation and their opinions

Teza (2008) NYT Critics' Pick

Aaron Arefe in a scene from “Teza.” (Mypheduh Films)
Lacking Shelter at Home and Abroad
Published: April 1, 2010
It’s all in the eyes. Remember that as you watch “Teza.”
Written and directed by the Ethiopian-born filmmaker Haile Gerima (“Sankofa,” “Ashes and Embers”) over more than a decade, this film is an autobiographical drama about a rural villager who journeys to Europe from Ethiopia and back again. He sees his country transformed from a pseudo-monarchical dictatorship into an equally savage Marxist hellhole; gains an education and loses his innocence; falls in and out of love; makes and loses friends; and endures enough trauma to fill nine lives. Yet he ultimately finds reason to truly live again, rather than merely exist.

Blending thumbnail sketches of 20th-century European and African history, intimate personal drama, nightmares, hallucinations and meditative landscape shots, Mr. Gerima’s film has all the hallmarks of a career summation — and early on it seems fated to collapse beneath the weight of its ambitions.

Instead, it soars, thanks to Mr. Gerima’s bracingly direct storytelling. We see this confused, vicious, sprawling world refracted through, and reflected in, the eyes of this movie’s hero, Anberber — a onetime villager turned Westernized doctor and intellectual who enters the film in middle age, gray-haired and potbellied, limping home on a prosthetic leg. As played by the Ethiopian-American actor Aaron Arefe, the character is a psychologically complex individual with an Everyman’s charm.

We see Anberber watch leftist soldiers conscript a teenage boy and brutalize that boy’s weeping mother. We see him try (and fail) to connect with his own mother, who can’t bear to hear the details of her son’s pain. We watch him flirt with an expatriate African beauty in Cologne, Germany (“I believe in socialism as long as I have my own girlfriend”), then slowly warm to a shunned local woman to whom his mother has given shelter.

We look on as Anberber hotheadedly denounces his own government, then swallows his pride and recants before a “self-criticism tribunal” (a pungently Orwellian phrase). Anberber’s reactions are the movie’s universal translator. Whatever larger points are being made about the politics or culture of a time or place, we’re always in the moment, looking out through the hero’s eyes. We feel what he’s thinking.

The film’s disruptive flash cuts and symbolically loaded juxtapositions (portraits of Marx, Lenin and Stalin replacing those of the deposed Ethiopian emperor Haile Selassie; a leaky faucet summoning repressed memories of a bloody victim of mob violence) channel Mr. Gerima’s previous movies, as well as works by politically minded contemporaries, particularly Ken Loach (“Land and Freedom”).

But the unfussy directness of Mr. Gerima’s filmmaking also harks back to the touchstones of mid-20th-century European art cinema, “Hiroshima Mon Amour” and “Wild Strawberries” — intimate dramas showing how the past invades and defines the present.

“Teza” confronts big subjects: the corrosive impact of white European racism, the ugly similarity between right- and left-wing groupthink, Ethiopian class conflict, the moral equivocations of expatriate intellectuals, the corruption in postcolonial communist governments.

Yet the movie never degenerates into a laundry list of pet issues because Mr. Gerima’s sensibility is humanist and fundamentally decent and sane. He doesn’t just reject political, philosophical, sexual, racial and spiritual dogma of every sort. He seems to view dogma itself as the one true evil: the ideological armor of bullies throughout history; the enemy of freedom, of art, of happiness itself.


Re: Haunt Jemima

Post by my2cents » 08 Feb 2015, 07:17


At Grammys Event, Bob Dylan Speech Steals the Show

FEBRUARY 7, 2015
LOS ANGELES — At the Grammys’ annual charity gala on Friday, Bob Dylan stole the show without singing a single note.

In a wide-ranging 35-minute speech that had the 3,000 or so music executives and stars in the audience hanging on his every word, Mr. Dylan touched on the roots of his songwriting, the musicians who inspired him, and the naysaying of critics and others along the way.

It was an extremely rare and revealing speech from Mr. Dylan, 73, but in his usual fashion it was anything but straightforward. Reading from a thick cache of papers, he spoke in what at times was a kind of rhapsodic, canny prose-poetry, like one of his lyrics or an outtake from his 2004 memoir, “Chronicles, Volume One.”

“These songs of mine,” he said, “they’re like mystery plays, the kind Shakespeare saw when he was growing up. I think you could trace what I do back that far. They were on the fringes then, and I think they’re on the fringes now.”

Mr. Dylan was accepting the person of the year award from MusiCares, a charity affiliated with the Grammys that supports musicians in financial need or in health crises. Since MusiCares began in 1989, it has distributed nearly $40 million in aid, according to the group, and the event on Friday, at the Los Angeles Convention Center, raised a record $7 million through sales of tickets and memorabilia.

The night was packed with performances of Mr. Dylan’s songs by the likes of Bruce Springsteen; Neil Young; Norah Jones; Sheryl Crow; Willie Nelson; Jack White; Tom Jones; and Crosby, Stills and Nash. The award to Mr. Dylan was presented by former President Jimmy Carter, who said that Mr. Dylan’s “words on peace and human rights are much more incisive, much more powerful and much more permanent than those of any president of the United States.”

Mr. Dylan began with thanks to people who helped his career early on, like John Hammond, the storied talent scout who signed him to Columbia Records, and Peter, Paul and Mary, whose version of “Blowin’ in the Wind” gave Mr. Dylan his first big hit, in 1963. He paid tribute to Joan Baez, Jimi Hendrix and Johnny Cash, and also thanked the Byrds, the Turtles and Sonny and Cher, whose covers brought him more pop hits, even if, he said, he never wanted to be a pop songwriter.

“Their versions of songs were like commercials,” he said. “But I didn’t really mind that, because 50 years later my songs were used for commercials. So that was good too.”

He gave a lesson in the folk-inspired songwriting process, saying that “my songs didn’t just come out of thin air — I didn’t just make them up.” Giving numerous examples, Mr. Dylan showed how the traditional songs he sang in his youth inspired his own writing “subliminally and unconsciously.”

“If you sang ‘John Henry’ as many times as me — ‘John Henry was a steel-driving man, driving with a hammer in his hand, John Henry said a man ain’t nothing but a man,’ ” he said. “If you sang that song as many times as I did, you would have written ‘How many roads must a man walk down,’ too.”

Mr. Dylan took jabs at music icons like the songwriters Leiber and Stoller (“Yakety Yak,” “Stand by Me”), saying that he didn’t care that they didn’t like his songs, because he didn’t like theirs either. Nashville wasn’t spared. In barely diplomatic terms, Mr. Dylan mocked the country songwriter Tom T. Hall, saying that his sentimental 1973 song “I Love” (“I love baby ducks, old pickup trucks”) was “a little overcooked,” and implying that Mr. Hall was part of an old guard that was bemused and left behind by the musical revolution of the 1960s and ’70s.

But he saved most of his bile for critics, clearly showing that he has read enough of his reviews over the years to let them get under his skin. “Critics have been giving me a hard time since Day 1,” he said. “Critics say I can’t sing. I croak. Sound like a frog.” He paused, as nervous giggles spread through the crowd. “Why don’t critics say the same thing about Tom Waits?” (Well, actually, they do.)

Mr. Dylan wound up his speech with tender comments about his friend Billy Lee Riley, a 1950s rockabilly singer on Sun Records, the original home of Elvis Presley and Johnny Cash. Known to collectors for his 1957 song “Red Hot,” Mr. Riley never made it big. When he got sick, Mr. Dylan said, MusiCares helped pay Mr. Riley’s medical bills and mortgage, to help make his life “at least comfortable, tolerable, to the end, and that is something that can’t be repaid.” Mr. Riley died in 2009.

Usually the annual MusiCares concerts, which in the past have featured stars like Barbra Streisand, Paul McCartney and Mr. Springsteen, end with a performance by the honoree, but after a few quick photos from the stage Mr. Dylan was off, and Mr. Young ended the night with a haunting “Blowin’ in the Wind.” As the high-heeled and tuxedoed crowd filtered out, few seemed to complain.


Re: Haunt Jemima

Post by my2cents » 08 Feb 2015, 07:22


For Saudis and Pakistan, a Bird of Contention

FEBRUARY 7, 2015
For decades, royal Arab hunting expeditions have traveled to the far reaches of Pakistan in pursuit of the houbara bustard — a waddling, migratory bird whose meat, they believe, contains aphrodisiac powers.

Little expense is spared for the elaborate winter hunts. Cargo planes fly tents and luxury jeeps into custom-built desert airstrips, followed by private jets carrying the kings and princes of Persian Gulf countries along with their precious charges: expensive hunting falcons that are used to kill the white-plumed houbara.

This year’s hunt, however, has run into difficulty.

It started in November, when the High Court in Baluchistan, the vast and tumultuous Pakistani province that is a favored hunting ground, canceled all foreign hunting permits in response to complaints from conservationists.

Those experts say the houbara’s habitat, and perhaps the long-term survival of the species, which is already considered threatened, has been endangered by the ferocious pace of hunting.

That legal order ballooned into a minor political crisis last week when a senior Saudi prince and his entourage landed in Baluchistan, attracting unusually critical media attention and a legal battle that is scheduled to reach the country’s Supreme Court in the coming days.

Anger among conservationists was heightened by the fact that the prince — Fahd bin Sultan bin Abdul Aziz, the governor of Tabuk province — along with his entourage had killed 2,100 houbara over 21 days during last year’s hunt, according to an official report leaked to the Pakistani news media, or about 20 times more than his allocated quota.

Still, Prince Fahd faced little censure when he touched down in Dalbandin, a dusty town near the Afghan border on Wednesday, to be welcomed by a delegation led by a cabinet minister and including senior provincial officials.

His reception was a testament, critics say, to the money-driven magnetism of Saudi influence in Pakistan, and the walk-on role of the humble bustard in cementing that relationship.

“This is a clear admission of servility to the rich Arabs,” said Pervez Hoodbhoy, a physics professor and longtime critic of what he calls “Saudization” in Pakistan. “They come here, hunt with impunity, and are given police protection in spite of the fact that they are violating local laws.”

The dispute has focused attention on a practice that started in the 1970s, when intensive hunting in the Persian Gulf nearly rendered the houbara extinct there, and with it a cherished tradition considered the sport of kings.

As the houbara migrated from its breeding grounds in Siberia, newly enriched Persian Gulf royalty flocked to the deserts and fields of Pakistan, where they were welcomed with open arms by the country’s leaders.

For the Pakistanis, the hunt has become an opportunity to earn money and engage in a form of soft diplomacy.

Although only 29 foreigners have been permitted houbara licenses this year, according to press reports, they include some of the wealthiest and most powerful men in the Middle East, including the kings of Bahrain and Saudi Arabia, the Emir of Kuwait and the ruler of Dubai.

Their devotion to the houbara can seem mysterious to outsiders. The bird’s meat is bitter and stringy, and its supposed aphrodisiac properties are not supported by scientific evidence.

But falcon hunting, and the pursuit of the houbara, occupy a romantic place in the Bedouin Arab culture.

In Pakistan, the lavish nature of the winter hunts, which take place largely away from public scrutiny, has become the stuff of legend. In the early ’90s, it was reported, the Saudi king arrived in Pakistan with a retinue of dancing camels.

To curry favor with local communities, the Arab hunters have built roads, schools, madrassas and mosques, as well as several international-standard airstrips in unlikely places.

The only airport, at Rahim Yar Khan in the south of Punjab Province, is named after Sheikh Zayed bin Sultan al-Nahayan, the former ruler of Abu Dhabi.

In recent times the hunts have also played a role, albeit an unwitting one, in the United States’ war against Al Qaeda.

Osama bin Laden took refuge at a houbara hunting camp in western Afghanistan in the late 1990s, by several accounts, at a time when the C.I.A. was plotting to assassinate him with a missile strike.

The journalist Steve Coll wrote in his Pulitzer Prize-winning book, “Ghost Wars,” that American officials declined to take the shot, fearing that the Arab sheikh who was hosting Bin Laden would have been at risk of dying in the attack.

For several years starting in 2004, the C.I.A. used an Arab-built airstrip at Shamsi, a barren desert valley in central Baluchistan, to launch drone strikes against Islamist militants in Pakistan’s tribal belt.

When news of the American base stirred a scandal in Pakistan’s Parliament in 2011, the country’s air force chief sought to deflect blame onto the United Arab Emirates government.

The deserts around Dalbandin, where Prince Fahd landed on Wednesday, were the site of Pakistan’s first nuclear test explosion in 1998, and are an established way station for heroin smugglers and Taliban insurgents.

But the growing influence of Gulf Arab countries is not universally appreciated. Progressive Pakistanis bemoan their conservative influence on society, and the infusion of petrodollars for jihadi groups.

The hunts have also come under attack. In Baluchistan, where the houbara is the provincial symbol, some royal hunts had to be curtailed after Baluch separatist rebels opened fire on hunting parties.

Now the battle has shifted to the capital, Islamabad. The prime minister, Nawaz Sharif, enjoys close relations with the rulers of Saudi Arabia, where he spent much of his exile between 2000 and 2007 — one reason, critics believe, for the indulgence shown toward Prince Fahd.

Mr. Sharif sent his federal planning minister, Ahsan Iqbal, along with Baluchistan’s minister for sports and culture, to greet Prince Fahd in Dalbandin.

“Not a single political leader reacted against illegal hunting by Arab princes,” Asma Jahangir, a prominent human rights campaigner, posted on Twitter.

Although Mr. Sharif never confirmed it, Saudi Arabia is widely believed to have injected $1.5 billion into Mr. Sharif’s government last year to help prop up the ailing economy. Last year in Islamabad, Mr. Sharif laid out a lavish welcome for the other Saudi hunting permit holder: Salman bin Abdul-Aziz Al Saud, who last month was inaugurated as king.

The International Union for Conservation of Nature has termed the houbara a vulnerable species, and India has banned the hunt. The Baluchistan court order in November cited Pakistan’s obligations under international conservation treaties.

Hunt supporters say the houbara population has never been scientifically surveyed, and complain that the royal visits are being unnecessarily politicized.

“The foreigners are a blessing, not a problem,” said Ernest Shams of Houbara Foundation International Pakistan, a charity that works with the United Arab Emirates government to boost houbara stocks. “They bring so much money into the country.”

In a bid to overcome the court ban, the Baluchistan government has lodged an appeal in Pakistan’s Supreme Court that is likely to be heard on Wednesday, officials in Islamabad said Friday.

Prince Fahd is currently at his hunting camp in Bar Tagzi, surrounded by his falcons and a contingent of security — and most definitely not hunting any houbara, according to Pakistani officials.

“They are visiting development sites,” said Obaidullah Jan Babat, an adviser to the Baluchistan chief minister. “They are not hunting.”


Re: Haunt Jemima

Post by my2cents » 08 Feb 2015, 07:55

Newest Cameras from Canon and Olympus Push the Pixels

Canon’s 5D S camera is currently the world’s highest resolution full-frame DSLR, with a 50.6-megapixel sensor

‘Escape From Tomorrow’: A Feature Film Shot in Disney Theme Parks Without Disney’s Permission
Monday, January 21st, 2013 by Peter Sciretta

Independent film is filled with dreamers naive enough to believe in the impossible — filmmakers who don’t concern themselves with the millions of reasons not to make a movie. Some of the best works of art are born of this naivety.

Escape From Tomorrow is a movie that takes place during a family vacation to the Walt Disney World Resort in Orlando, Florida. Not only do the filmmakers make no attempt to hide or obscure the location, but the Disney theme park and costumed characters play a huge part in the story. Most of the movie was shot in Magic Kingdom, Epcot and Disneyland without the knowledge or permission of Disney. This is a film that, from a conventional perspective, should never have been created, never mind screened at the top independent film festival in the United States. But it was, and after the break we’ll tell you how it was done.

Director Randy Moore went into the parks with his actors and a tiny crew and shot the film entirely on the Canon 5D Mark III DSLR. The cinematographer and AD conducted intensive location scouting, and every shot was exhaustively planned and blocked in advance. They even charted the position of the sun weeks ahead for each shot of the movie to make up for the lack of lighting equipment. Sound was recorded without an on-set sound mixer, sometimes on smartphones and sometimes on digital recorders taped to each actor, which captured an entire day’s worth of audio that editors had to sort through afterward.

While the film was shot guerrilla-style, it doesn’t feel like found footage or home video. They chose to shoot the entire movie in black & white, a practical decision that helped them get a better feel for composition and lighting in camera as they shot in the parks. The result is a film with a certain classic feel, which adds to the cinematic aesthetic. The director argues that most people haven’t seen the Disney theme parks (especially the Disney World parks) in black and white, and that the style brings out details that normally go unnoticed. Disney fans will probably associate the footage with Walt Disney’s early telecasts from Disneyland.

The movie takes place over the course of one day, following a family vacationing at Walt Disney World. The father receives a phone call early in the morning informing him that he no longer has a job. He tells no one — all he wants to do is have one last day of vacation in the park. His relationship with his wife is strained, and the day isn’t the pleasurable escape from reality that he had hoped for. He begins fantasizing about and following a couple of French-speaking teenage girls, experiencing delusions around the park that may hint at a more sinister underbelly of Disney World.

The cast and crew bought season passes to enter the parks as “normal visitors,” filming ten days in the Orlando parks and two weeks in Disneyland in Anaheim. Most viewers would never notice, but the Disney World presented in the film is actually an amalgamation of Disney parks on both U.S. coasts. You’ll see the family walking through Cinderella’s castle in Magic Kingdom, and minutes later standing in line for Buzz Lightyear’s Astro Blasters in Disneyland’s Tomorrowland (Star Tours, a ride that doesn’t even exist in Magic Kingdom, can be seen in the background).

I live in Los Angeles and have the top-level annual pass to the Disneyland parks; I’m a Disneyland fanatic who goes a dozen times each year. I’ve been on every ride, watched every show, and I’ve probably seen everything there is to see in the public areas of the parks. My family vacationed at Walt Disney World a couple of times growing up, and more recently I visited Orlando on an anniversary trip a year ago.

This is a long way of saying that I know the experience of the parks better than most. The film is partly the story of one family’s day-long adventure through the parks, and I’m not sure I can experience that adventure through new eyes. But maybe that’s the point. Almost everyone has visited a Disney theme park, and the experiences are universal. So when things begin to go haywire, it feels all the more surreal.

Near the end of the shoot they were almost caught by Disney while filming the family entering the Disneyland gates. The Disney cast members thought the camera crew were paparazzi trying to get a shot of a famous family. (Remember, they were shooting with a DSLR camera.) The cast and crew were taken aside, and the family insisted they were not famous. A cast member kept asking, “Why did you enter the park two times in seven minutes?” Luckily, the young girl in the cast began screaming that she needed to go to the bathroom. The cast and crew escaped after a crowded parade began on Main Street, the wireless sound mics shoved into their socks in case they were stopped. But they weren’t.

When I first heard about this film I thought it was a total no-budget guerrilla production, but that is far from the case. The actors are professionals, though the performances still leave a lot to be desired compared to the other films at Sundance. Golden Globe-nominated composer Abel Korzeniowski (A Single Man) recorded music for the film on the Eastwood stage of the Warner Bros. lot. Visual effects were completed by the same company in South Korea that worked on The Host. Green-screen and set production took place on the same movie stages used by Cecil B. DeMille and D.W. Griffith.

Escape From Tomorrow is not a great film. The story has some good ideas, but the execution is uneven. And yet it is unlike anything you’ve seen before and will probably be unlike anything you see again. For that reason alone, I would recommend you see this film if you have the chance.

The film shines in its trippier moments, when it becomes about something more than a family vacationing at Disney World. Spoilers for a movie that might never be released coming up: Jim discovers that the Disney princesses are high-priced undercover hookers who service Asian businessmen. During one sequence, the animatronic kid figures in It’s a Small World start making weird faces and doing strange things as Jim and his family ride through. Jim sees a vision of the huge Spaceship Earth building (the big ball at Epcot) destroyed with explosives. Seeing imagery of a terrorist-like attack in the Disney parks is unnerving. And near the end, Jim is captured and taken into a building below Spaceship Earth by the evil Siemens corporation (which sponsors the ride). There is more I’m leaving out, but those are the major crazy beats.

“If you have the chance” are the key words, as I don’t expect you ever will. Disney is very protective of its image, and things happen in this movie that the company would in no way want to be connected with. For instance, a group of Disney princesses are revealed to be undercover hookers for Asian businessmen. The film also features some sex and nudity, though those scenes were not shot inside the park.

Disney characters and intellectual property appear in almost every shot, with no attempt to cover them or cut them out of the frame. In fact, some of the classic Mickey characters appear on props and set decoration used in the sequences shot outside the park, on fully crafted sets on a studio lot. The only thing the filmmaker chose to censor is a single mention of “Disney” by one of the main characters. The move itself is confusing, as the word Disney appears in text many times in shots from around the park.

Intellectual property and copyrights aside, many people appear in this film who never signed a release. Real families and children are seen in the background of almost every shot. None of them gave permission or knew they were being filmed for a feature film. This also includes cast members in the parks, not just those in costume. In one scene early on, the family has its photo taken in front of Cinderella’s Castle by a Disney cast member with a camera. Close-ups of real cast members waving with Mickey hands are featured as the family exits the park in another sequence. So there are many legal reasons why this film will probably never be publicly available outside of the few screenings at Sundance.

While Moore embraces the Disney World location to a possibly extreme legal fault, paradoxically he did decide to replace the iconic (copyrighted) music in It’s a Small World and the Enchanted Tiki Room, and replaced the film projected in Soarin’ with generic stock footage of flyover shots. How strange is it that the filmmakers thought it would be okay to have actresses playing hookers dressed up as Disney princesses in the park, and to feature actual Disney art prominently on screen, but drew the line at Sherman Brothers-composed theme park music?

Escape From Tomorrow isn’t the first movie to be shot in Disneyland without permission of the Mouse. Banksy’s documentary Exit Through the Gift Shop features a very memorable sequence in which Banksy plays a prank inside the park and Mr. Brainwash (who shot footage of the prank) was questioned in Disney’s backstage jail. He escaped with the footage, which appeared in the film without any fallout that we’re aware of. Of course, Exit is a documentary, so that footage probably falls under fair use. It also avoids showing any Disney intellectual property in close-up.

And last year a viral short titled Missing In The Mansion was shot inside Disneyland, in the famous Haunted Mansion attraction. The found-footage horror film was posted online for free, and Disney has made no apparent attempt to get it taken down. But Escape is the first fictional feature film that I know of to shoot a significant portion of its footage in the Disney parks without an “OK.”


Re: Haunt Jemima

Post by my2cents » 08 Feb 2015, 08:02


The New Yorker | Annals of Medicine

The Trip Treatment
Research into psychedelics, shut down for decades, is now yielding exciting results

Psilocybin may be useful in treating anxiety, addiction, and depression, and in studying the neurobiology of mystical experience

On an April Monday in 2010, Patrick Mettes, a fifty-four-year-old television news director being treated for a cancer of the bile ducts, read an article on the front page of the Times that would change his death. His diagnosis had come three years earlier, shortly after his wife, Lisa, noticed that the whites of his eyes had turned yellow. By 2010, the cancer had spread to Patrick’s lungs and he was buckling under the weight of a debilitating chemotherapy regimen and the growing fear that he might not survive. The article, headlined “HALLUCINOGENS HAVE DOCTORS TUNING IN AGAIN,” mentioned clinical trials at several universities, including N.Y.U., in which psilocybin—the active ingredient in so-called magic mushrooms—was being administered to cancer patients in an effort to relieve their anxiety and “existential distress.” One of the researchers was quoted as saying that, under the influence of the hallucinogen, “individuals transcend their primary identification with their bodies and experience ego-free states . . . and return with a new perspective and profound acceptance.” Patrick had never taken a psychedelic drug, but he immediately wanted to volunteer. Lisa was against the idea. “I didn’t want there to be an easy way out,” she recently told me. “I wanted him to fight.”

Patrick made the call anyway and, after filling out some forms and answering a long list of questions, was accepted into the trial. Since hallucinogens can sometimes bring to the surface latent psychological problems, researchers try to weed out volunteers at high risk by asking questions about drug use and whether there is a family history of schizophrenia or bipolar disorder. After the screening, Mettes was assigned to a therapist named Anthony Bossis, a bearded, bearish psychologist in his mid-fifties, with a specialty in palliative care. Bossis is a co-principal investigator for the N.Y.U. trial.

After four meetings with Bossis, Mettes was scheduled for two dosings—one of them an “active” placebo (in this case, a high dose of niacin, which can produce a tingling sensation), and the other a pill containing the psilocybin. Both sessions, Mettes was told, would take place in a room decorated to look more like a living room than like a medical office, with a comfortable couch, landscape paintings on the wall, and, on the shelves, books of art and mythology, along with various aboriginal and spiritual tchotchkes, including a Buddha and a glazed ceramic mushroom. During each session, which would last the better part of a day, Mettes would lie on the couch wearing an eye mask and listening through headphones to a carefully curated playlist—Brian Eno, Philip Glass, Pat Metheny, Ravi Shankar. Bossis and a second therapist would be there throughout, saying little but being available to help should he run into any trouble.

I met Bossis last year in the N.Y.U. treatment room, along with his colleague Stephen Ross, an associate professor of psychiatry at N.Y.U.’s medical school, who directs the ongoing psilocybin trials. Ross, who is in his forties, was dressed in a suit and could pass for a banker. He is also the director of the substance-abuse division at Bellevue, and he told me that he had known little about psychedelics—drugs that produce radical changes in consciousness, including hallucinations—until a colleague happened to mention that, in the nineteen-sixties, LSD had been used successfully to treat alcoholics. Ross did some research and was astounded at what he found.

“I felt a little like an archeologist unearthing a completely buried body of knowledge,” he said. Beginning in the nineteen-fifties, psychedelics had been used to treat a wide variety of conditions, including alcoholism and end-of-life anxiety. The American Psychiatric Association held meetings centered on LSD. “Some of the best minds in psychiatry had seriously studied these compounds in therapeutic models, with government funding,” Ross said.

Between 1953 and 1973, the federal government spent four million dollars to fund a hundred and sixteen studies of LSD, involving more than seventeen hundred subjects. (These figures don’t include classified research.) Through the mid-nineteen-sixties, psilocybin and LSD were legal and remarkably easy to obtain. Sandoz, the Swiss chemical company where, in 1938, Albert Hofmann first synthesized LSD, gave away large quantities of Delysid—LSD—to any researcher who requested it, in the hope that someone would discover a marketable application. Psychedelics were tested on alcoholics, people struggling with obsessive-compulsive disorder, depressives, autistic children, schizophrenics, terminal cancer patients, and convicts, as well as on perfectly healthy artists and scientists (to study creativity) and divinity students (to study spirituality). The results reported were frequently positive. But many of the studies were, by modern standards, poorly designed and seldom well controlled, if at all. When there were controls, it was difficult to blind the researchers—that is, hide from them which volunteers had taken the actual drug. (This remains a problem.)

By the mid-nineteen-sixties, LSD had escaped from the laboratory and swept through the counterculture. In 1970, Richard Nixon signed the Controlled Substances Act and put most psychedelics on Schedule 1, prohibiting their use for any purpose. Research soon came to a halt, and what had been learned was all but erased from the field of psychiatry. “By the time I got to medical school, no one even talked about it,” Ross said.

The clinical trials at N.Y.U.—a second one, using psilocybin to treat alcohol addiction, is now getting under way—are part of a renaissance of psychedelic research taking place at several universities in the United States, including Johns Hopkins, the Harbor-U.C.L.A. Medical Center, and the University of New Mexico, as well as at Imperial College, in London, and the University of Zurich. As the drug war subsides, scientists are eager to reconsider the therapeutic potential of these drugs, beginning with psilocybin. (Last month The Lancet, the United Kingdom’s most prominent medical journal, published a guest editorial in support of such research.) The effects of psilocybin resemble those of LSD, but, as one researcher explained, “it carries none of the political and cultural baggage of those three letters.” LSD is also stronger and longer-lasting in its effects, and is considered more likely to produce adverse reactions. Researchers are using or planning to use psilocybin not only to treat anxiety, addiction (to smoking and alcohol), and depression but also to study the neurobiology of mystical experience, which the drug, at high doses, can reliably occasion. Forty years after the Nixon Administration effectively shut down most psychedelic research, the government is gingerly allowing a small number of scientists to resume working with these powerful and still somewhat mysterious molecules.

As I chatted with Tony Bossis and Stephen Ross in the treatment room at N.Y.U., their excitement about the results was evident. According to Ross, cancer patients receiving just a single dose of psilocybin experienced immediate and dramatic reductions in anxiety and depression, improvements that were sustained for at least six months. The data are still being analyzed and have not yet been submitted to a journal for peer review, but the researchers expect to publish later this year.

“I thought the first ten or twenty people were plants—that they must be faking it,” Ross told me. “They were saying things like ‘I understand love is the most powerful force on the planet,’ or ‘I had an encounter with my cancer, this black cloud of smoke.’ People who had been palpably scared of death—they lost their fear. The fact that a drug given once can have such an effect for so long is an unprecedented finding. We have never had anything like it in the psychiatric field.”

I was surprised to hear such unguarded enthusiasm from a scientist, and a substance-abuse specialist, about a street drug that, since 1970, has been classified by the government as having no accepted medical use and a high potential for abuse. But the support for renewed research on psychedelics is widespread among medical experts. “I’m personally biased in favor of these type of studies,” Thomas R. Insel, the director of the National Institute of Mental Health (N.I.M.H.) and a neuroscientist, told me. “If it proves useful to people who are really suffering, we should look at it. Just because it is a psychedelic doesn’t disqualify it in our eyes.” Nora Volkow, the director of the National Institute on Drug Abuse (NIDA), emphasized that “it is important to remind people that experimenting with drugs of abuse outside a research setting can produce serious harms.”

Many researchers I spoke with described their findings with excitement, some using words like “mind-blowing.” Bossis said, “People don’t realize how few tools we have in psychiatry to address existential distress. Xanax isn’t the answer. So how can we not explore this, if it can recalibrate how we die?”

Herbert D. Kleber, a psychiatrist and the director of the substance-abuse division at the Columbia University–N.Y. State Psychiatric Institute, who is one of the nation’s leading experts on drug abuse, struck a cautionary note. “The whole area of research is fascinating,” he said. “But it’s important to remember that the sample sizes are small.” He also stressed the risk of adverse effects and the importance of “having guides in the room, since you can have a good experience or a frightful one.” But he added, referring to the N.Y.U. and Johns Hopkins research, “These studies are being carried out by very well trained and dedicated therapists who know what they’re doing. The question is, is it ready for prime time?”

The idea of giving a psychedelic drug to the dying was conceived by a novelist: Aldous Huxley. In 1953, Humphry Osmond, an English psychiatrist, introduced Huxley to mescaline, an experience he chronicled in “The Doors of Perception,” in 1954. (Osmond coined the word “psychedelic,” which means “mind-manifesting,” in a 1957 letter to Huxley.) Huxley proposed a research project involving the “administration of LSD to terminal cancer cases, in the hope that it would make dying a more spiritual, less strictly physiological process.” Huxley had his wife inject him with the drug on his deathbed; he died at sixty-nine, of laryngeal cancer, on November 22, 1963.

Psilocybin mushrooms first came to the attention of Western medicine (and popular culture) in a fifteen-page 1957 Life article by an amateur mycologist—and a vice-president of J. P. Morgan in New York—named R. Gordon Wasson. In 1955, after years spent chasing down reports of the clandestine use of magic mushrooms among indigenous Mexicans, Wasson was introduced to them by María Sabina, a curandera—a healer, or shaman—in southern Mexico. Wasson’s awed first-person account of his psychedelic journey during a nocturnal mushroom ceremony inspired several scientists, including Timothy Leary, a well-regarded psychologist doing personality research at Harvard, to take up the study of psilocybin. After trying magic mushrooms in Cuernavaca, in 1960, Leary conceived the Harvard Psilocybin Project, to study the therapeutic potential of hallucinogens. His involvement with LSD came a few years later.

In the wake of Wasson’s research, Albert Hofmann experimented with magic mushrooms in 1957. “Thirty minutes after my taking the mushrooms, the exterior world began to undergo a strange transformation,” he wrote. “Everything assumed a Mexican character.” Hofmann proceeded to identify, isolate, and then synthesize the active ingredient, psilocybin, the compound being used in the current research.

Perhaps the most influential and rigorous of these early studies was the Good Friday experiment, conducted in 1962 by Walter Pahnke, a psychiatrist and minister working on a Ph.D. dissertation under Leary at Harvard. In a double-blind experiment, twenty divinity students received a capsule of white powder right before a Good Friday service at Marsh Chapel, on the Boston University campus; ten contained psilocybin, ten an active placebo (nicotinic acid). Eight of the ten students receiving psilocybin reported a mystical experience, while only one in the control group experienced a feeling of “sacredness” and a “sense of peace.” (Telling the subjects apart was not difficult, rendering the double-blind a somewhat hollow conceit: those on the placebo sat sedately in their pews while the others lay down or wandered around the chapel, muttering things like “God is everywhere” and “Oh, the glory!”) Pahnke concluded that the experiences of eight who received the psilocybin were “indistinguishable from, if not identical with,” the classic mystical experiences reported in the literature by William James, Walter Stace, and others.

In 1991, Rick Doblin, the director of the Multidisciplinary Association for Psychedelic Studies (MAPS), published a follow-up study, in which he tracked down all but one of the divinity students who received psilocybin at Marsh Chapel and interviewed seven of them. They all reported that the experience had shaped their lives and work in profound and enduring ways. But Doblin found flaws in Pahnke’s published account: he had failed to mention that several subjects struggled with acute anxiety during their experience. One had to be restrained and given Thorazine, a powerful antipsychotic, after he ran from the chapel and headed down Commonwealth Avenue, convinced that he had been chosen to announce that the Messiah had arrived.

The first wave of research into psychedelics was doomed by an excessive exuberance about their potential. For people working with these remarkable molecules, it was difficult not to conclude that they were suddenly in possession of news with the power to change the world—a psychedelic gospel. They found it hard to justify confining these drugs to the laboratory or using them only for the benefit of the sick. It didn’t take long for once respectable scientists such as Leary to grow impatient with the rigmarole of objective science. He came to see science as just another societal “game,” a conventional box it was time to blow up—along with all the others.

Was the suppression of psychedelic research inevitable? Stanislav Grof, a Czech-born psychiatrist who used LSD extensively in his practice in the nineteen-sixties, believes that psychedelics “loosed the Dionysian element” on America, posing a threat to the country’s Puritan values that was bound to be repulsed. (He thinks the same thing could happen again.) Roland Griffiths, a psychopharmacologist at Johns Hopkins University School of Medicine, points out that ours is not the first culture to feel threatened by psychedelics: the reason Gordon Wasson had to rediscover magic mushrooms in Mexico was that the Spanish had suppressed them so thoroughly, deeming them dangerous instruments of paganism.

“There is such a sense of authority that comes out of the primary mystical experience that it can be threatening to existing hierarchical structures,” Griffiths told me when we met in his office last spring. “We ended up demonizing these compounds. Can you think of another area of science regarded as so dangerous and taboo that all research gets shut down for decades? It’s unprecedented in modern science.”

Early in 2006, Tony Bossis, Stephen Ross, and Jeffrey Guss, a psychiatrist and N.Y.U. colleague, began meeting after work on Friday afternoons to read up on and discuss the scientific literature on psychedelics. They called themselves the P.R.G., or Psychedelic Reading Group, but within a few months the “R” in P.R.G. had come to stand for “Research.” They had decided to try to start an experimental trial at N.Y.U., using psilocybin alongside therapy to treat anxiety in cancer patients. The obstacles to such a trial were formidable: Would the F.D.A. and the D.E.A. grant permission to use the drug? Would N.Y.U.’s Institutional Review Board, charged with protecting experimental subjects, allow them to administer a psychedelic to cancer patients? Then, in July of 2006, the journal Psychopharmacology published a landmark article by Roland Griffiths, et al., titled “Psilocybin Can Occasion Mystical-Type Experiences Having Substantial and Sustained Personal Meaning and Spiritual Significance.”

“We all rushed in with Roland’s article,” Bossis recalls. “It solidified our confidence that we could do this work. Johns Hopkins had shown it could be done safely.” The article also gave Ross the ammunition he needed to persuade a skeptical I.R.B. “The fact that psychedelic research was being done at Hopkins—considered the premier medical center in the country—made it easier to get it approved here. It was an amazing study, with such an elegant design. And it opened up the field.” (Even so, psychedelic research remains tightly regulated and closely scrutinized. The N.Y.U. trial could not begin until Ross obtained approvals first from the F.D.A., then from N.Y.U.’s Oncology Review Board, and then from the I.R.B., the Bellevue Research Review Committee, the Bluestone Center for Clinical Research, the Clinical and Translational Science Institute, and, finally, the Drug Enforcement Administration, which must grant the license to use a Schedule I substance.)

Griffiths’s double-blind study reprised the work done by Pahnke in the nineteen-sixties, but with considerably more scientific rigor. Thirty-six volunteers, none of whom had ever taken a hallucinogen, received a pill containing either psilocybin or an active placebo (methylphenidate, or Ritalin); in a subsequent session the pills were reversed. “When administered under supportive conditions,” the paper concluded, “psilocybin occasioned experiences similar to spontaneously occurring mystical experiences.” Participants ranked these experiences as among the most meaningful in their lives, comparable to the birth of a child or the death of a parent. Two-thirds of the participants rated the psilocybin session among the top five most spiritually significant experiences of their lives; a third ranked it at the top. Fourteen months later, these ratings had slipped only slightly.

Furthermore, the “completeness” of the mystical experience closely tracked the improvements reported in personal well-being, life satisfaction, and “positive behavior change” measured two months and then fourteen months after the session. (The researchers relied on both self-assessments and the assessments of co-workers, friends, and family.) The authors determined the completeness of a mystical experience using two questionnaires, including the Pahnke-Richards Mystical Experience Questionnaire, which is based in part on William James’s writing in “The Varieties of Religious Experience.” The questionnaire measures feelings of unity, sacredness, ineffability, peace and joy, as well as the impression of having transcended space and time and the “noetic sense” that the experience has disclosed some objective truth about reality. A “complete” mystical experience is one that exhibits all six characteristics. Griffiths believes that the long-term effectiveness of the drug is due to its ability to occasion such a transformative experience, not to any lasting change in the brain’s chemistry of the kind a conventional psychiatric drug like Prozac produces.

A follow-up study by Katherine MacLean, a psychologist in Griffiths’s lab, found that the psilocybin experience also had a positive and lasting effect on the personality of most participants. This is a striking result, since the conventional wisdom in psychology holds that personality is usually fixed by age thirty and thereafter is unlikely to change substantially. But more than a year after their psilocybin sessions, volunteers who had had the most complete mystical experiences showed significant increases in their “openness,” one of the five domains that psychologists look at in assessing personality traits. (The others are conscientiousness, extroversion, agreeableness, and neuroticism.) Openness, which encompasses aesthetic appreciation, imagination, and tolerance of others’ viewpoints, is a good predictor of creativity.

“I don’t want to use the word ‘mind-blowing,’ ” Griffiths told me, “but, as a scientific phenomenon, if you can create conditions in which seventy per cent of people will say they have had one of the five most meaningful experiences of their lives? To a scientist, that’s just incredible.”

The revival of psychedelic research today owes much to the respectability of its new advocates. At sixty-eight, Roland Griffiths, who was trained as a behaviorist and holds senior appointments in psychiatry and neuroscience at Hopkins, is one of the nation’s leading drug-addiction researchers. More than six feet tall, he is rail-thin and stands bolt upright; the only undisciplined thing about him is a thatch of white hair so dense that it appears to have fought his comb to a draw. His long, productive relationship with NIDA, the National Institute on Drug Abuse, has resulted in some three hundred and fifty papers, with titles such as “Reduction of Heroin Self-Administration in Baboons by Manipulation of Behavioral and Pharmacological Conditions.” Tom Insel, the director of the National Institute of Mental Health, described Griffiths as “a very careful, thoughtful scientist” with “a reputation for meticulous data analysis. So it’s fascinating that he’s now involved in an area that other people might view as pushing the edge.”

Griffiths’s career took an unexpected turn in the nineteen-nineties after two serendipitous introductions. The first came when a friend introduced him to Siddha Yoga, in 1994. He told me that meditation acquainted him with “something way, way beyond a material world view that I can’t really talk to my colleagues about, because it involves metaphors or assumptions that I’m really uncomfortable with as a scientist.” He began entertaining “fanciful thoughts” of quitting science and going to India.

In 1996, an old friend and colleague named Charles R. (Bob) Schuster, recently retired as the head of NIDA, suggested that Griffiths talk to Robert Jesse, a young man he’d recently met at Esalen, the retreat center in Big Sur, California. Jesse was neither a medical professional nor a scientist; he was a computer guy, a vice-president at Oracle, who had made it his mission to revive the science of psychedelics, as a tool not so much of medicine as of spirituality. He had organized a gathering of researchers and religious figures to discuss the spiritual and therapeutic potential of psychedelic drugs and how they might be rehabilitated.

When the history of second-wave psychedelic research is written, Bob Jesse will be remembered as one of two scientific outsiders who worked for years, mostly behind the scenes, to get it off the ground. (The other is Rick Doblin, the founder of MAPS.) While on leave from Oracle, Jesse established a nonprofit called the Council on Spiritual Practices, with the aim of “making direct experience of the sacred more available to more people.” (He prefers the term “entheogen,” or “God-facilitating,” to “psychedelic.”) In 1996, the C.S.P. organized the historic gathering at Esalen. Many of the fifteen in attendance were “psychedelic elders,” researchers such as James Fadiman and Willis Harman, both of whom had done early psychedelic research while at Stanford, and religious figures like Huston Smith, the scholar of comparative religion. But Jesse wisely decided to invite an outsider as well: Bob Schuster, a drug-abuse expert who had served in two Republican Administrations. By the end of the meeting, the Esalen group had decided on a plan: “to get aboveboard, unassailable research done, at an institution with investigators beyond reproach,” and, ideally, “do this without any promise of clinical treatment.” Jesse was ultimately less interested in people’s mental disorders than in their spiritual well-being—in using entheogens for what he calls “the betterment of well people.”

Shortly after the Esalen meeting, Bob Schuster (who died in 2011) phoned Jesse to tell him about his old friend Roland Griffiths, whom he described as “the investigator beyond reproach” Jesse was looking for. Jesse flew to Baltimore to meet Griffiths, inaugurating a series of conversations and meetings about meditation and spirituality that eventually drew Griffiths into psychedelic research and would culminate, a few years later, in the 2006 paper in Psychopharmacology.

The significance of the 2006 paper went far beyond its findings. The journal invited several prominent drug researchers and neuroscientists to comment on the study, and all of them treated it as a convincing case for further research. Herbert Kleber, of Columbia, applauded the paper and acknowledged that “major therapeutic possibilities” could result from further psychedelic research studies, some of which “merit N.I.H. support.” Solomon Snyder, the Hopkins neuroscientist who, in the nineteen-seventies, discovered the brain’s opioid receptors, summarized what Griffiths had achieved for the field: “The ability of these researchers to conduct a double-blind, well-controlled study tells us that clinical research with psychedelic drugs need not be so risky as to be off-limits to most investigators.”

Roland Griffiths and Bob Jesse had opened a door that had been tightly shut for more than three decades. Charles Grob, at U.C.L.A., was the first to step through it, winning F.D.A. approval for a Phase I pilot study to assess the safety, dosing, and efficacy of psilocybin in the treatment of anxiety in cancer patients. Next came the Phase II trials, just concluded at both Hopkins and N.Y.U., involving higher doses and larger groups (twenty-nine at N.Y.U.; fifty-six at Hopkins)—including Patrick Mettes and about a dozen other cancer patients in New York and Baltimore whom I recently interviewed.

Since 2006, Griffiths’s lab has conducted a pilot study on the potential of psilocybin to treat smoking addiction, the results of which were published last November in the Journal of Psychopharmacology. The sample is tiny—fifteen smokers—but the success rate is striking. Twelve subjects, all of whom had tried to quit multiple times, using various methods, were verified as abstinent six months after treatment, a success rate of eighty per cent. (Currently, the leading cessation treatment is nicotine-replacement therapy; a recent review article in the BMJ—formerly the British Medical Journal—reported that the treatment helped smokers remain abstinent for six months in less than seven per cent of cases.) In the Hopkins study, subjects underwent two or three psilocybin sessions and a course of cognitive-behavioral therapy to help them deal with cravings. The psychedelic experience seems to allow many subjects to reframe, and then break, a lifelong habit. “Smoking seemed irrelevant, so I stopped,” one subject told me. The volunteers who reported a more complete mystical experience had greater success in breaking the habit. A larger Phase II trial comparing psilocybin to nicotine replacement (both in conjunction with cognitive-behavioral therapy) is getting under way at Hopkins.

“We desperately need a new treatment approach for addiction,” Herbert Kleber told me. “Done in the right hands—and I stress that, because the whole psychedelic area attracts people who often think that they know the truth before doing the science—this could be a very useful one.”

Thus far, criticism of psychedelic research has been limited. Last summer, Florian Holsboer, the director of the Max Planck Institute of Psychiatry, in Munich, told Science, “You can’t give patients some substance just because it has an antidepressant effect on top of many other effects. That’s too dangerous.” Nora Volkow, of NIDA, wrote me in an e-mail that “the main concern we have at NIDA in relation to this work is that the public will walk away with the message that psilocybin is a safe drug to use. In fact, its adverse effects are well known, although not completely predictable.” She added, “Progress has been made in decreasing use of hallucinogens, particularly in young people. We would not want to see that trend altered.”

The recreational use of psychedelics is famously associated with instances of psychosis, flashback, and suicide. But these adverse effects have not surfaced in the trials of drugs at N.Y.U. and Johns Hopkins. After nearly five hundred administrations of psilocybin, the researchers have reported no serious negative effects. This is perhaps less surprising than it sounds, since volunteers are self-selected, carefully screened and prepared for the experience, and are then guided through it by therapists well trained to manage the episodes of fear and anxiety that many volunteers do report. Apart from the molecules involved, a psychedelic therapy session and a recreational psychedelic experience have very little in common.

The lab at Hopkins is currently conducting a study of particular interest to Griffiths: examining the effect of psilocybin on long-term meditators. The study plans to use fMRI—functional magnetic-resonance imaging—to study the brains of forty meditators before, during, and after they have taken psilocybin, to measure changes in brain activity and connectivity and to see what these “trained contemplatives can tell us about the experience.” Griffiths’s lab is also launching a study in collaboration with N.Y.U. that will give the drug to religious professionals in a number of faiths to see how the experience might contribute to their work. “I feel like a kid in a candy shop,” Griffiths told me. “There are so many directions to take this research. It’s a Rip Van Winkle effect—after three decades of no research, we’re rubbing the sleep from our eyes.”

“Ineffability” is a hallmark of the mystical experience. Many volunteers struggle to describe the bizarre events going on in their minds during a guided psychedelic journey without sounding like either a New Age guru or a lunatic. The available vocabulary isn’t always up to the task of recounting an experience that seemingly can take someone out of body, across vast stretches of time and space, and include face-to-face encounters with divinities and demons and previews of their own death.

Volunteers in the N.Y.U. psilocybin trial were required to write a narrative of their experience soon after the treatment, and Patrick Mettes, having worked in journalism, took the assignment seriously. His wife, Lisa, said that, after his Friday session, he worked all weekend to make sense of the experience and write it down.

When Mettes arrived at the treatment room, at First Avenue and Twenty-fifth Street, Tony Bossis and Krystallia Kalliontzi, his guides, greeted him, reviewed the day’s plan, and, at 9 A.M., presented him with a small chalice containing the pill. None of them knew whether it contained psilocybin or the placebo. Asked to state his intention, Mettes said that he wanted to learn to cope better with the anxiety and the fear that he felt about his cancer. As the researchers had suggested, he’d brought a few photographs along—of Lisa and him on their wedding day, and of their dog, Arlo—and placed them around the room.

At nine-thirty, Mettes lay down on the couch, put on the headphones and eye mask, and fell silent. In his account, he likened the start of the journey to the launch of a space shuttle, “a physically violent and rather clunky liftoff which eventually gave way to the blissful serenity of weightlessness.”

Several of the volunteers I interviewed reported feeling intense fear and anxiety before giving themselves up to the experience, as the guides encourage them to do. The guides work from a set of “flight instructions” prepared by Bill Richards, a Baltimore psychologist who worked with Stanislav Grof during the nineteen-seventies and now trains a new generation of psychedelic therapists. The document is a summary of the experience accumulated from managing thousands of psychedelic sessions—and countless bad trips—during the nineteen-sixties, whether these took place in therapeutic settings or in the bad-trip tent at Woodstock.

The “same force that takes you deep within will, of its own impetus, return you safely to the everyday world,” the manual offers at one point. Guides are instructed to remind subjects that they’ll never be left alone and not to worry about their bodies while journeying, since the guides will keep an eye on them. If you feel like you’re “dying, melting, dissolving, exploding, going crazy etc.—go ahead,” embrace it: “Climb staircases, open doors, explore paths, fly over landscapes.” And if you confront anything frightening, “look the monster in the eye and move towards it. . . . Dig in your heels; ask, ‘What are you doing in my mind?’ Or, ‘What can I learn from you?’ Look for the darkest corner in the basement, and shine your light there.” This training may help explain why the darker experiences that sometimes accompany the recreational use of psychedelics have not surfaced in the N.Y.U. and Hopkins trials.

Early on, Mettes encountered his brother’s wife, Ruth, who had died of cancer more than twenty years earlier, at forty-three. Ruth “acted as my tour guide,” he wrote, and “didn’t seem surprised to see me. She ‘wore’ her translucent body so I would know her.” Michelle Obama made an appearance. “The considerable feminine energy all around me made clear the idea that a mother, any mother, regardless of her shortcomings . . . could never NOT love her offspring. This was very powerful. I know I was crying.” He felt as if he were coming out of the womb, “being birthed again.”

Bossis noted that Mettes was crying and breathing heavily. Mettes said, “Birth and death is a lot of work,” and appeared to be convulsing. Then he reached out and clutched Kalliontzi’s hand while pulling his knees up and pushing, as if he were delivering a baby.

“Oh God,” he said, “it all makes sense now, so simple and beautiful.”

Around noon, Mettes asked to take a break. “It was getting too intense,” he wrote. They helped him to the bathroom. “Even the germs were beautiful, as was everything in our world and universe.” Afterward, he was reluctant to “go back in.” He wrote, “The work was considerable but I loved the sense of adventure.” He put on his eye mask and headphones and lay back down.

“From here on, love was the only consideration. It was and is the only purpose. Love seemed to emanate from a single point of light. And it vibrated.” He wrote that “no sensation, no image of beauty, nothing during my time on earth has felt as pure and joyful and glorious as the height of this journey.”

Then, at twelve-ten, he said something that Bossis jotted down: “O.K., we can all punch out now. I get it.”

He went on to take a tour of his lungs, where he “saw two spots.” They were “no big deal.” Mettes recalled, “I was being told (without words) not to worry about the cancer . . . it’s minor in the scheme of things . . . simply an imperfection of your humanity.”

Then he experienced what he called “a brief death.”

“I approached what appeared to be a very sharp, pointed piece of stainless steel. It had a razor blade quality to it. I continued up to the apex of this shiny metal object and as I arrived, I had a choice, to look or not look, over the edge and into the infinite abyss.” He stared into “the vastness of the universe,” hesitant but not frightened. “I wanted to go all in but felt that if I did, I would possibly leave my body permanently,” he wrote. But he “knew there was much more for me here.” Telling his guides about his choice, he explained that he was “not ready to jump off and leave Lisa.”

Around 3 P.M., it was over. “The transition from a state where I had no sense of time or space to the relative dullness of now, happened quickly. I had a headache.”

When Lisa arrived to take him home, Patrick “looked like he had run a race,” she recalled. “The color in his face was not good, he looked tired and sweaty, but he was fired up.” He told her he had touched the face of God.

Bossis was deeply moved by the session. “You’re in this room, but you’re in the presence of something large,” he recalled. “It’s humbling to sit there. It’s the most rewarding day of your career.”

Every guided psychedelic journey is different, but a few themes seem to recur. Several of the cancer patients I interviewed at N.Y.U. and Hopkins described an experience of either giving birth or being born. Many also described an encounter with their cancer that had the effect of diminishing its power over them. Dinah Bazer, a shy woman in her sixties who had been given a diagnosis of ovarian cancer in 2010, screamed at the black mass of fear she encountered while peering into her rib cage: “[deleted] you, I won’t be eaten alive!” Since her session, she says, she has stopped worrying about a recurrence—one of the objectives of the trial.

Great secrets of the universe often become clear during the journey, such as “We are all one” or “Love is all that matters.” The usual ratio of wonder to banality in the adult mind is overturned, and such ideas acquire the force of revealed truth. The result is a kind of conversion experience, and the researchers believe that this is what is responsible for the therapeutic effect.

Subjects revelled in their sudden ability to travel seemingly at will through space and time, using it to visit Elizabethan England, the banks of the Ganges, or Wordsworthian scenes from their childhood. The impediment of a body is gone, as is one’s identity, yet, paradoxically, a perceiving and recording “I” still exists. Several volunteers used the metaphor of a camera being pulled back on the scene of their lives, to a point where matters that had once seemed daunting now appeared manageable—smoking, cancer, even death. Their accounts are reminiscent of the “overview effect” described by astronauts who have glimpsed the earth from a great distance, an experience that some of them say permanently altered their priorities. Roland Griffiths likens the therapeutic experience of psilocybin to a kind of “inverse P.T.S.D.”—“a discrete event that produces persisting positive changes in attitudes, moods, and behavior, and presumably in the brain.”

Death looms large in the journeys taken by the cancer patients. A woman I’ll call Deborah Ames, a breast-cancer survivor in her sixties (she asked not to be identified), described zipping through space as if in a video game until she arrived at the wall of a crematorium and realized, with a fright, “I’ve died and now I’m going to be cremated. The next thing I know, I’m below the ground in this gorgeous forest, deep woods, loamy and brown. There are roots all around me and I’m seeing the trees growing, and I’m part of them. It didn’t feel sad or happy, just natural, contented, peaceful. I wasn’t gone. I was part of the earth.” Several patients described edging up to the precipice of death and looking over to the other side. Tammy Burgess, given a diagnosis of ovarian cancer at fifty-five, found herself gazing across “the great plain of consciousness. It was very serene and beautiful. I felt alone but I could reach out and touch anyone I’d ever known. When my time came, that’s where my life would go once it left me and that was O.K.”

I was struck by how the descriptions of psychedelic journeys differed from the typical accounts of dreams. For one thing, most people’s recall of their journey is not just vivid but comprehensive, the narratives they reconstruct seamless and fully accessible, even years later. They don’t regard these narratives as “just a dream,” the evanescent products of fantasy or wish fulfillment, but, rather, as genuine and sturdy experiences. This is the “noetic” quality that students of mysticism often describe: the unmistakable sense that whatever has been learned or witnessed has the authority and the durability of objective truth. “You don’t get that on other drugs,” as Roland Griffiths points out; after the fact, we’re fully aware of, and often embarrassed by, the inauthenticity of the drug experience.

This might help explain why so many cancer patients in the trials reported that their fear of death had lifted or at least abated: they had stared directly at death and come to know something about it, in a kind of dress rehearsal. “A high-dose psychedelic experience is death practice,” Katherine MacLean, the former Hopkins psychologist, said. “You’re losing everything you know to be real, letting go of your ego and your body, and that process can feel like dying.” And yet you don’t die; in fact, some volunteers become convinced by the experience that consciousness may somehow survive the death of their bodies.

In follow-up discussions with Bossis, Patrick Mettes spoke of his body and his cancer as a “type of illusion” and how there might be “something beyond this physical body.” It also became clear that, psychologically, at least, Mettes was doing remarkably well: he was meditating regularly, felt he had become better able to live in the present, and described loving his wife “even more.” In a session in March, two months after his journey, Bossis noted that Mettes “reports feeling the happiest in his life.”

How are we to judge the veracity of the insights gleaned during a psychedelic journey? It’s one thing to conclude that love is all that matters, but quite another to come away from a therapy convinced that “there is another reality” awaiting us after death, as one volunteer put it, or that there is more to the universe—and to consciousness—than a purely materialist world view would have us believe. Is psychedelic therapy simply foisting a comforting delusion on the sick and dying?

“That’s above my pay grade,” Bossis said, with a shrug, when I asked him. Bill Richards cited William James, who suggested that we judge the mystical experience not by its veracity, which is unknowable, but by its fruits: does it turn someone’s life in a positive direction?

Many researchers acknowledge that the power of suggestion may play a role when a drug like psilocybin is administered by medical professionals with legal and institutional sanction: under such conditions, the expectations of the therapist are much more likely to be fulfilled by the patient. (And bad trips are much less likely to occur.) But who cares, some argue, as long as it helps? David Nichols, an emeritus professor of pharmacology at Purdue University—and a founder, in 1993, of the Heffter Research Institute, a key funder of psychedelic research—put the pragmatic case most baldly in a recent interview with Science: “If it gives them peace, if it helps people to die peacefully with their friends and their family at their side, I don’t care if it’s real or an illusion.”

Roland Griffiths is willing to consider the challenge that the mystical experience poses to the prevailing scientific paradigm. He conceded that “authenticity is a scientific question not yet answered” and that all that scientists have to go by is what people tell them about their experiences. But he pointed out that the same is true for much more familiar mental phenomena.

“What about the miracle that we are conscious? Just think about that for a second, that we are aware we’re aware!” Insofar as I was on board for one miracle well beyond the reach of materialist science, Griffiths was suggesting, I should remain open to the possibility of others.

“I’m willing to hold that there’s a mystery here we don’t understand, that these experiences may or may not be ‘true,’ ” he said. “What’s exciting is to use the tools we have to explore and pick apart this mystery.”

Perhaps the most ambitious attempt to pick apart the scientific mystery of the psychedelic experience has been taking place in a lab based at Imperial College, in London. There a thirty-four-year-old neuroscientist named Robin Carhart-Harris has been injecting healthy volunteers with psilocybin and LSD and then using a variety of scanning tools—including fMRI and magnetoencephalography (MEG)—to observe what happens in their brains.

Carhart-Harris works in the laboratory of David Nutt, a prominent English psychopharmacologist. Nutt served as the drug-policy adviser to the Labour Government until 2009, when he was fired for arguing that psychedelic drugs should be rescheduled on the ground that they are safer than alcohol or tobacco and potentially invaluable to neuroscience. Carhart-Harris’s own path to neuroscience was an eccentric one. First, he took a graduate course in psychoanalysis—a field that few neuroscientists take seriously, regarding it less as a science than as a set of untestable beliefs. Carhart-Harris was fascinated by psychoanalytic theory but frustrated by the paucity of its tools for exploring what it deemed most important about the mind: the unconscious.

“If the only way we can access the unconscious mind is via dreams and free association, we aren’t going to get anywhere,” he said. “Surely there must be something else.” One day, he asked his seminar leader if that might be a drug. She was intrigued. He set off to search the library catalogue for “LSD and the Unconscious” and found “Realms of the Human Unconscious,” by Stanislav Grof. “I read the book cover to cover. That set the course for the rest of my young life.”

Carhart-Harris, who is slender and intense, with large pale-blue eyes that seldom blink, decided that he would use psychedelic drugs and modern brain-imaging techniques to put a foundation of hard science beneath psychoanalysis. “Freud said dreams were the royal road to the unconscious,” he said in our first interview. “LSD may turn out to be the superhighway.” Nutt agreed to let him follow this hunch in his lab. He ran bureaucratic interference and helped secure funding (from the Beckley Foundation, which supports psychedelic research).

When, in 2010, Carhart-Harris first began studying the brains of volunteers on psychedelics, neuroscientists assumed that the drugs somehow excited brain activity—hence the vivid hallucinations and powerful emotions that people report. But when Carhart-Harris looked at the results of the first set of fMRI scans—which pinpoint areas of brain activity by mapping local blood flow and oxygen consumption—he discovered that the drug appeared to substantially reduce brain activity in one particular region: the “default-mode network.”

The default-mode network was first described in 2001, in a landmark paper by Marcus Raichle, a neurologist at Washington University, in St. Louis, and it has since become the focus of much discussion in neuroscience. The network comprises a critical and centrally situated hub of brain activity that links parts of the cerebral cortex to deeper, older structures in the brain, such as the limbic system and the hippocampus.

The network, which consumes a significant portion of the brain’s energy, appears to be most active when we are least engaged in attending to the world or to a task. It lights up when we are daydreaming, removed from sensory processing, and engaging in higher-level “meta-cognitive” processes such as self-reflection, mental time travel, rumination, and “theory of mind”—the ability to attribute mental states to others. Carhart-Harris describes the default-mode network variously as the brain’s “orchestra conductor” or “corporate executive” or “capital city,” charged with managing and “holding the entire system together.” It is thought to be the physical counterpart of the autobiographical self, or ego.

“The brain is a hierarchical system,” Carhart-Harris said. “The highest-level parts”—such as the default-mode network—“have an inhibitory influence on the lower-level parts, like emotion and memory.” He discovered that blood flow and electrical activity in the default-mode network dropped off precipitously under the influence of psychedelics, a finding that may help to explain the loss of the sense of self that volunteers reported. (The biggest dropoffs in default-mode-network activity correlated with volunteers’ reports of ego dissolution.) Just before Carhart-Harris published his results, in a 2012 paper in Proceedings of the National Academy of Sciences, a researcher at Yale named Judson Brewer, who was using fMRI to study the brains of experienced meditators, noticed that their default-mode networks had also been quieted relative to those of novice meditators. It appears that, with the ego temporarily out of commission, the boundaries between self and world, subject and object, all dissolve. These are hallmarks of the mystical experience.

If the default-mode network functions as the conductor of the symphony of brain activity, we might expect its temporary disappearance from the stage to lead to an increase in dissonance and mental disorder—as appears to happen during the psychedelic journey. Carhart-Harris has found evidence in scans of brain waves that, when the default-mode network shuts down, other brain regions “are let off the leash.” Mental contents hidden from view (or suppressed) during normal waking consciousness come to the fore: emotions, memories, wishes and fears. Regions that don’t ordinarily communicate directly with one another strike up conversations (neuroscientists sometimes call this “crosstalk”), often with bizarre results. Carhart-Harris thinks that hallucinations occur when the visual-processing centers of the brain, left to their own devices, become more susceptible to the influence of our beliefs and emotions.

Carhart-Harris doesn’t romanticize psychedelics, and he has little patience for the sort of “magical thinking” and “metaphysics” they promote. In his view, the forms of consciousness that psychedelics unleash are regressions to a more “primitive style of cognition.” Following Freud, he says that the mystical experience—whatever its source—returns us to the psychological condition of the infant, who has yet to develop a sense of himself as a bounded individual. The pinnacle of human development is the achievement of the ego, which imposes order on the anarchy of a primitive mind buffeted by magical thinking. (The developmental psychologist Alison Gopnik has speculated that the way young children perceive the world has much in common with the psychedelic experience. As she puts it, “They’re basically tripping all the time.”) The psychoanalytic value of psychedelics, in his view, is that they allow us to bring the workings of the unconscious mind “into an observable space.”

In “The Doors of Perception,” Aldous Huxley concluded from his psychedelic experience that the conscious mind is less a window on reality than a furious editor of it. The mind is a “reducing valve,” he wrote, eliminating far more reality than it admits to our conscious awareness, lest we be overwhelmed. “What comes out at the other end is a measly trickle of the kind of consciousness which will help us to stay alive.” Psychedelics open the valve wide, removing the filter that hides much of reality, as well as dimensions of our own minds, from ordinary consciousness. Carhart-Harris has cited Huxley’s metaphor in some of his papers, likening the default-mode network to the reducing valve, but he does not agree that everything that comes through the opened doors of perception is necessarily real. The psychedelic experience, he suggests, can yield a lot of “fool’s gold.”

Nevertheless, Carhart-Harris believes that the psychedelic experience can help people by relaxing the grip of an overbearing ego and the rigid, habitual thinking it enforces. The human brain is perhaps the most complex system there is, and the emergence of a conscious self is its highest achievement. By adulthood, the mind has become very good at observing and testing reality and developing confident predictions about it that optimize our investments of energy (mental and otherwise) and therefore our survival. Much of what we think of as perceptions of the world are really educated guesses based on past experience (“That fractal pattern of little green bits in my visual field must be a tree”), and this kind of conventional thinking serves us well.

But only up to a point. In Carhart-Harris’s view, a steep price is paid for the achievement of order and ego in the adult mind. “We give up our emotional lability,” he told me, “our ability to be open to surprises, our ability to think flexibly, and our ability to value nature.” The sovereign ego can become a despot. This is perhaps most evident in depression, when the self turns on itself and uncontrollable introspection gradually shades out reality. In “The Entropic Brain,” a paper published last year in Frontiers in Human Neuroscience, Carhart-Harris cites research indicating that this debilitating state, sometimes called “heavy self-consciousness,” may be the result of a “hyperactive” default-mode network. The lab recently received government funding to conduct a clinical study using psychedelics to treat depression.

Carhart-Harris believes that people suffering from other mental disorders characterized by excessively rigid patterns of thinking, such as addiction and obsessive-compulsive disorder, could benefit from psychedelics, which “disrupt stereotyped patterns of thought and behavior.” In his view, all these disorders are, in a sense, ailments of the ego. He also thinks that this disruption could promote more creative thinking. It may be that some brains could benefit from a little less order.

Existential distress at the end of life bears many of the psychological hallmarks of a hyperactive default-mode network, including excessive self-reflection and an inability to jump the deepening grooves of negative thought. The ego, faced with the prospect of its own dissolution, becomes hypervigilant, withdrawing its investment in the world and other people. It is striking that a single psychedelic experience—an intervention that Carhart-Harris calls “shaking the snow globe”—should have the power to alter these patterns in a lasting way.

This appears to be the case for many of the patients in the clinical trial of psilocybin just concluded at Hopkins and N.Y.U. Patrick Mettes lived for seventeen months after his psilocybin journey, and, according to Lisa, he enjoyed many unexpected satisfactions in that time, along with a dawning acceptance of death.

“We still had our arguments,” Lisa recalled. “And we had a very trying summer,” as they endured a calamitous apartment renovation. But Patrick “had a sense of patience he had never had before, and with me he had real joy about things,” she said. “It was as if he had been relieved of the duty of caring about the details of life. Now it was about being with people, enjoying his sandwich and the walk on the promenade. It was as if we lived a lifetime in a year.”

After the psilocybin session, Mettes spent his good days walking around the city. “He would walk everywhere, try every restaurant for lunch, and tell me about all these great places he’d discovered. But his good days got fewer and fewer.” In March, 2012, he stopped chemo. “He didn’t want to die,” she said. “But I think he just decided that this is not how he wanted to live.”

In April, his lungs failing, Mettes wound up back in the hospital. “He gathered everyone together and said goodbye, and explained that this is how he wanted to die. He had a very conscious death.”

Mettes’s equanimity exerted a powerful influence on everyone around him, Lisa said, and his room in the palliative-care unit at Mt. Sinai became a center of gravity. “Everyone, the nurses and the doctors, wanted to hang out in our room—they just didn’t want to leave. Patrick would talk and talk. He put out so much love.” When Tony Bossis visited Mettes the week before he died, he was struck by Mettes’s serenity. “He was consoling me. He said his biggest sadness was leaving his wife. But he was not afraid.”

Lisa took a picture of Patrick a few days before he died, and when it popped open on my screen it momentarily took my breath away: a gaunt man in a hospital gown, an oxygen clip in his nose, but with shining blue eyes and a broad smile.

Lisa stayed with him in his hospital room night after night, the two of them often talking into the morning hours. “I feel like I have one foot in this world and one in the next,” he told her at one point. Lisa told me, “One of the last nights we were together, he said, ‘Honey, don’t push me. I’m finding my way.’ ”

Lisa hadn’t had a shower in days, and her brother encouraged her to go home for a few hours. Minutes before she returned, Patrick slipped away. “He wasn’t going to die as long as I was there,” she said. “My brother had told me, ‘You need to let him go.’ ”

Lisa said she feels indebted to the people running the N.Y.U. trial and is convinced that the psilocybin experience “allowed him to tap into his own deep resources. That, I think, is what these mind-altering drugs do.”

Despite the encouraging results from the N.Y.U. and Hopkins trials, much stands in the way of the routine use of psychedelic therapy. “We don’t die well in America,” Bossis recently said over lunch at a restaurant near the N.Y.U. medical center. “Ask people where they want to die, and they will tell you at home, with their loved ones. But most of us die in an I.C.U. The biggest taboo in American medicine is the conversation about death. To a doctor, it’s a defeat to let a patient go.” Bossis and several of his colleagues described the considerable difficulty they had recruiting patients from N.Y.U.’s cancer center for the psilocybin trials. “I’m busy trying to keep my patients alive,” one oncologist told Gabrielle Agin-Liebes, the trial’s project manager. Only when reports of positive experiences began to filter back to the cancer center did nurses there—not doctors—begin to tell patients about the trial.

Recruitment is only one of the many challenges facing a Phase III trial of psilocybin, which would involve hundreds of patients at multiple locations and cost millions of dollars. The University of Wisconsin and the University of California, Los Angeles, are making plans to participate in such a trial, but F.D.A. approval is not guaranteed. If the trial were successful, the government would be under pressure to reschedule psilocybin under the Controlled Substances Act, having recognized a medical use for the drug.

Also, it seems unlikely that the government would ever fund such a study. “The N.I.M.H. is not opposed to work with psychedelics, but I doubt we would make a major investment,” Tom Insel, the institute’s director, told me. He said that the N.I.M.H. would need to see “a path to development” and suspects that “it would be very difficult to get a pharmaceutical company interested in developing this drug, since it cannot be patented.” It’s also unlikely that Big Pharma would have any interest in a drug that is administered only once or twice in the course of treatment. “There’s not a lot of money here when you can be cured with one session,” Bossis pointed out. Still, Bob Jesse and Rick Doblin are confident that they will find private money for a Phase III clinical trial, and several private funders I spoke to indicated that it would be forthcoming.

Many of the researchers and therapists I interviewed are confident that psychedelic therapy will eventually become routine. Katherine MacLean hopes someday to establish a “psychedelic hospice,” a retreat center where the dying and their loved ones can use psychedelics to help them all let go. “If we limit psychedelics just to the patient, we’re sticking with the old medical model,” she said. “But psychedelics are so much more radical than that. I get nervous when people say they should only be prescribed by a doctor.”

In MacLean’s thinking, one hears echoes of the excitement of the sixties about the potential of psychedelics to help a wide range of people, and the impatience with the cumbersome structures of medicine. It was precisely this exuberance about psychedelics, and the frustration with the slow pace of science, that helped fuel the backlash against them.

Still, “the betterment of well people,” to borrow a phrase of Bob Jesse’s, is very much on the minds of most of the researchers I interviewed, some of whom were more reluctant to discuss it on the record than institutional outsiders like Jesse and MacLean. For them, medical acceptance is a first step to a broader cultural acceptance. Jesse would like to see the drugs administered by skilled guides working in “longitudinal multigenerational contexts”—which, as he describes them, sound a lot like church communities. Others envisage a time when people seeking a psychedelic experience—whether for reasons of mental health or spiritual seeking or simple curiosity—could go to something like a “mental-health club,” as Julie Holland, a psychiatrist formerly at Bellevue, described it: “Sort of like a cross between a spa/retreat and a gym where people can experience psychedelics in a safe, supportive environment.” All spoke of the importance of well-trained guides (N.Y.U. has had a training program in psychedelic therapy since 2008, directed by Jeffrey Guss, a co-principal investigator for the psilocybin trials) and the need to help people afterward “integrate” the powerful experiences they have had in order to render them truly useful. This is not something that happens when these drugs are used recreationally. Bossis paraphrases Huston Smith on this point: “A spiritual experience does not by itself make a spiritual life.”

When I asked Rick Doblin if he worries about another backlash, he suggested that the culture has made much progress since the nineteen-sixties. “That was a very different time,” he said. “People wouldn’t even talk about cancer or death then. Women were tranquillized to give birth; men weren’t allowed in the delivery room. Yoga and meditation were totally weird. Now mindfulness is mainstream and everyone does yoga, and there are birthing centers and hospices all over. We’ve integrated all these things into our culture. And now I think we’re ready to integrate psychedelics.” He also points out that many of the people in charge of our institutions today have personal experience with psychedelics and so feel less threatened by them.

Bossis would like to believe in Doblin’s sunny forecast, and he hopes that “the legacy of this work” will be the routine use of psychedelics in palliative care. But he also thinks that the medical use of psychedelics could easily run into resistance. “This culture has a fear of death, a fear of transcendence, and a fear of the unknown, all of which are embodied in this work.” Psychedelics may be too disruptive for our society and institutions ever to embrace them.

The first time I raised the idea of “the betterment of well people” with Roland Griffiths, he shifted in his chair and chose his words carefully. “Culturally, right now, that’s a dangerous idea to promote,” he said. And yet, as we talked, it became clear that he, too, feels that many of us stand to benefit from these molecules and, even more, from the spiritual experiences they can make available.

“We are all terminal,” Griffiths said. “We’re all dealing with death. This will be far too valuable to limit to sick people.” ♦
