New Evidence Suggests Parkinson's Starts in The Gut – Not The Brain.

Scientists have found more evidence that Parkinson's could start in the gut before spreading to the brain, observing lower rates of the disease in patients who had undergone a procedure called a truncal vagotomy.

The operation removes sections of the vagus nerve – which links the digestive tract with the brain – and over the course of a five-year study, patients who had this link completely removed were 40 percent less likely to develop Parkinson’s than those who hadn’t.

According to a team led by Bojing Liu from the Karolinska Institutet in Sweden, that’s a significant difference, and it backs up earlier work linking the development of the brain disease to something happening inside our bellies.

If we can understand more about how this link operates, we might be better able to stop it.

“These results provide preliminary evidence that Parkinson’s disease may start in the gut,” says Liu.

“Other evidence for this hypothesis is that people with Parkinson’s disease often have gastrointestinal problems such as constipation, that can start decades before they develop the disease.”

The vagus nerve helps control various unconscious processes like heart rate and digestion, and resecting parts of it in a vagotomy is usually done to remove an ulcer if the stomach is producing a dangerous level of acid.

For this study, the researchers looked at 40 years of data from Swedish national registers, to compare 9,430 people who had a vagotomy against 377,200 people from the general population who hadn’t.

At first, the two groups appeared statistically similar in their likelihood of developing Parkinson’s – until the researchers looked at the type of vagotomy that had been carried out on the smaller group.

In total, 19 people (just 0.78 percent of the sample) developed Parkinson’s more than five years after a truncal (complete) vagotomy, compared to 60 people (1.08 percent) who had a selective vagotomy.

Compare that to the 3,932 people (1.15 percent) who had no surgery and developed Parkinson’s after being monitored for at least five years, and it seems clear that the vagus nerve is playing some kind of role.

So what’s going on? One hypothesis the scientists put forward is that proteins in the gut start folding in the wrong way, and that ‘mistake’ somehow gets carried up to the brain, spreading from cell to cell.

Parkinson’s develops as neurons in the brain get killed off, leading to tremors, stiffness, and difficulty with movement – but scientists aren’t sure how it’s caused in the first place. The new study gives them a helpful tip about where to look.

The latest research isn’t alone in its conclusions. Last year, tests on mice showed links between certain mixes of gut bacteria and a greater likelihood of developing Parkinson’s.

What’s more, earlier this year a study in the US identified differences between the gut bacteria of those with Parkinson’s compared with those who didn’t have the condition.

All of this is useful for scientists looking to prevent Parkinson’s, because if we know where it starts, we can block off the source.

But we shouldn’t get ahead of ourselves – as the researchers behind the new study point out, Parkinson’s is a complex condition, and they weren’t able to include controls for all potential factors, including caffeine intake and smoking.

It’s also worth noting that Parkinson’s is classed as a syndrome: a collection of different but related symptoms that may have multiple causes.

“Much more research is needed to test this theory and to help us understand the role this may play in the development of Parkinson’s,” says Liu.

The research has been published in Neurology.

Source : https://bit.ly/2DjS7jN


France wants mandatory vaccinations as it is ‘unacceptable children are still dying of measles’

Parents in France will be legally obliged to vaccinate their children from 2018, the government has announced.

French Prime Minister Édouard Philippe said it was “unacceptable” that children are “still dying of measles” in the country where some of the earliest vaccines were pioneered.

Three childhood vaccines, for diphtheria, tetanus and polio, are currently mandatory in France. Others, including those against hepatitis and whooping cough, are simply recommended.

Announcing the policy, Mr Philippe evoked the name of Louis Pasteur, the French biologist who made breakthroughs in disease research and developed the first vaccines for rabies and anthrax in the 19th century.

He said all the vaccines which are universally recommended by health authorities – 11 in total – would be compulsory.

The move follows a similar initiative in Italy, which recently banned non-vaccinated children from attending state schools.

The World Health Organisation has warned of major measles outbreaks spreading across Europe despite the availability of a safe, effective vaccine.

Anti-vaccine movements, whose followers are known as anti-vaxxers, are believed to have contributed to low rates of immunisation against the highly contagious disease in a number of countries.

A recent survey found more than three out of 10 French people don’t trust vaccines, with just 52 per cent of participants saying the benefits of vaccination outweigh the risks.

There were 79 cases of measles reported in France in the first two months of 2017, mostly due to an outbreak of 50 cases in the north-eastern Lorraine region, according to the European Centre for Disease Prevention and Control.

Between the beginning of 2008 and the end of 2016, more than 24,000 cases of measles were declared in France, official figures show. Of these, around 1,500 had serious complications and there were 10 deaths.

Vaccination is not mandatory in Britain, and around 24,000 children a year in England are not immunised against measles, mumps and rubella.

Fear surrounding the combined inoculation for the three infectious diseases, known as the MMR vaccine, stems in part from a discredited study claiming to show a link between the jab and autism.

The paper, published in medical journal The Lancet nearly 20 years ago by disgraced former doctor Andrew Wakefield, led to a heavy fall in uptake among parents at the time, but exhaustive scientific research has now disproved the theory.

Two children in the UK have died of measles since 2006, and in 2013 a young man from Wales died of the disease – all a “waste of life,” Dr Farah Jameel told doctors at the British Medical Association (BMA) annual meeting last month.

The BMA is calling for evidence to be submitted to the UK Government on “the potential advantages and disadvantages of childhood immunisation made mandatory under the law”.

Source : https://ind.pn/2sBs6Gu

Why Power Women Are Micro-Dosing LSD at Work.

Karen Smith has a lot on her mind. The 32-year-old lives in Chicago, where, after working for tech startups abroad for 10 years, she moved last year with her husband to attend a graduate program in data science. On top of her academic studies, Smith works 10 to 30 hours a week as the data guru for a consulting firm. But late last year, what was really bringing her down was the bleak Midwest winters. That, and she’d recently cut out her daily habit of cigarettes and marijuana, frustrated that she’d become so dependent on pot to manage her mood. She needed something to take the edge off.

Smith—whose husband was also feeling low and looking for relief—had an idea, something she’d run across on Reddit. After some research, her husband bought psilocybin (psychedelic) mushrooms from a friend, ground them up with a Cuisinart spice grinder, and separated them into gel capsules ordered from Amazon. The dosage was precisely measured and precisely tiny: 10 micrograms for Smith’s husband and about half of that for her, which is just below the threshold of what would normally make a user “trip.” She took the homemade pill with a glass of water and waited. A few days later, she swallowed another one.

For the rest of the winter and into the spring, Smith (not her real name—she’s concerned that the illegality of her self-medication could compromise her career) and her husband continued to take tiny doses of magic mushrooms every few days while going about their daily lives. Smith didn’t see swirling wild colors or shifting shapes. She didn’t feel as if the trees and sky were sparkling magically at her. She didn’t imagine that she saw God. Instead—along with shaking off those winter blues—she became very, very efficient. “It gives you fresh eyes,” she says, “for programming or figuring out algorithmic stuff. It made me really productive in a motivated way. Whatever mental block that was stopping me from doing something would disappear.” Plus, during her four-month-long mushroom experiment, she got a lot of household chores done.

The term for what Smith and her husband were trying is “micro-dosing,” a growing trend in psychotropic experimentation. Unlike other trending hallucinogenic experiences, like, say, drinking ayahuasca (a psychedelic tea brewed from Amazonian plants, sipped under the supervision of a shaman), micro-dosing doesn’t deliver an earth-shattering, body-wrenching, mind-blowing journey through the other side of the Doors of Perception. The idea is to change, in an almost imperceptible way, your everyday neural functioning for the better.

While it’s impossible to gather hard data on micro-dosing, anecdotal evidence suggests that its use is on the rise: The popular podcast Reply All devoted a segment to it last fall; Rolling Stone, VICE, and Forbes chronicled it as a trend shortly afterward; and one YouTube how-to tutorial has been streamed more than half a million times since it was posted in September 2015. Reddit, where Smith picked up the idea, has an entire subreddit devoted to the topic with more than 9,000 subscribers. Tech insiders in particular seem eager to try it out as an alternative to Adderall (the prescription stimulant, prescribed to treat ADD/ADHD, that helps users stay motivated and on task, but may cause irritability and anxiety)—one that helps not just with efficiency and focus, but also with creativity. The women who try micro-dosing aren’t burnouts; in fact, the ones we spoke to are high-achieving, and interested in becoming more so.


Women like New York Times best-selling author Ayelet Waldman. The writer and former drug-policy lawyer (and wife of author Michael Chabon) suffered for years from PMDD (premenstrual dysphoric disorder), a severe form of PMS that mimics depression, which she was treating with SSRIs (antidepressants) timed to the week before her period. But when the Berkeley, California-based Waldman, 52, hit perimenopause, her periods became far less predictable, and she began to hunt around for other options to manage her moods, which is how she began micro-dosing as a one-month experiment, despite her self-confessed aversion to drugs of that sort.

“I thought if there was one human being in the world destined to have a bad trip, it was Ayelet Waldman,” she says. “I mean, I could have a bad trip over breakfast. I don’t need a drug for that.” But she’d begun to realize that the legal drugs she’d been prescribed for years had plenty of drawbacks: “There was a study published about Ambien and Alzheimer’s long after I’d taken a thousand Ambien.”

Before trying her experiment, Waldman conducted extensive research into the myths and realities surrounding LSD. (Perhaps the most encouraging fact of all: “LSD is, as drugs go, safe. In terms of morbidity, it’s a lot more like marijuana than heroin,” according to her research.) She also corresponded with Menlo Park, California-based psychologist James Fadiman, Ph.D., whose chapter on micro-dosing in his 2011 underground classic, The Psychedelic Explorer’s Guide, introduced the term to the mainstream of drug culture (if not yet the mainstream itself). Fadiman explained exactly how to micro-dose and how he developed his method. Waldman was thrilled with the results: She regulated her own moods better and worked through marital bumps more easily. Her children—whom she told only that she was trying a new medication—gave her experiment glowing reviews. “I didn’t fly off the handle as much,” she says. “I wrote a whole book called Bad Mother [which was published in May 2009]. If I had been micro-dosing back then, I probably would have written Remarkably Calm, Compassionate Mother.”

What really surprised Waldman was the way it affected her work. “I found it inspired a state of calm hypomania. It was a flow but without the Adderall irritability. You lose track of time because you’re so into the work, and you’re making all these exciting connections.” Most telling, says Waldman: “I wrote a book in a month!” She turned her journal and research on micro-dosing into A Really Good Day: How Microdosing Made a Mega Difference in My Mood, My Marriage, and My Life (which will be published in January by Knopf).

Source : https://bit.ly/2SXGnc1

Did fruit, not friends, give us big brains?

Diet, not social life, may be the driver of brain size evolution, a new study suggests.

The findings call into question “the social brain hypothesis,” which argues that humans and other primates are big-brained because of their sociality.

The findings, which appear in the journal Nature Ecology and Evolution, reinforce the notion that both human and non-human primate brain evolution may be driven by differences in feeding rather than in socialization.

“Are humans and other primates big-brained because of social pressures and the need to think about and track our social relationships, as some have argued?” asks James Higham, assistant professor of anthropology at New York University.

“This has come to be the prevailing view, but our findings do not support it—in fact, our research points to other factors, namely diet.”

“Complex foraging strategies, social structures, and cognitive abilities are likely to have co-evolved throughout primate evolution,” adds Alex DeCasien, a doctoral candidate and the study’s lead author.

“However, if the question is: ‘Which factor, diet or sociality, is more important when it comes to determining the brain size of primate species?’ then our new examination suggests that factor is diet.”

The social brain hypothesis sees social complexity as the primary driver of primate cognitive complexity, suggesting that social pressures ultimately led to the evolution of the large human brain.

While some studies have shown positive relationships between relative brain size and group size, other studies which examined the effects of different social or mating systems have revealed highly conflicting results, raising questions about the strength of the social brain hypothesis.

In the new study, researchers, including Scott Williams, an assistant professor of anthropology, examined more than 140 primate species—or more than three times as many as previous studies—and incorporated more recent evolutionary trees, or phylogenies.

They took into account food consumption across the studied species—folivores (leaves), frugivores (fruit), frugivores/folivores, and omnivores (addition of animal protein)—as well as several measures of sociality, such as group size, social system, and mating system.

The findings show that brain size is predicted by diet rather than by the various measures of sociality—after controlling for body size and phylogeny. Notably, frugivores and frugivore/folivores exhibit significantly larger brains than folivores and, to a lesser extent, omnivores show significantly larger brains than folivores.

The results don’t necessarily reveal an association between brain size and fruit or protein consumption on a within-species level; rather, they are evidence of the cognitive demands required by different species to obtain certain foods.

“Fruit is patchier in space and time in the environment, and the consumption of it often involves extraction from difficult-to-reach-places or protective skins,” DeCasien says. “Together, these factors may lead to the need for relatively greater cognitive complexity and flexibility in frugivorous species.”

Source : https://bit.ly/2MmZvhq

Are We Living in The Post-Truth Era?

We are repeatedly told these days that we have entered the terrifying new era of post-truth, in which not just particular facts but entire histories might be faked. But if this is the era of post-truth, when, exactly, was the halcyon age of truth? And what triggered our transition to the post-truth era? The internet? Social media? The rise of Putin and Trump?

A cursory look at history reveals that propaganda and disinformation are nothing new. In fact, humans have always lived in the age of post-truth. Homo sapiens is a post-truth species, who conquered this planet thanks above all to the unique human ability to create and spread fictions. We are the only mammals that can cooperate with numerous strangers because only we can invent fictional stories, spread them around, and convince millions of others to believe in them. As long as everybody believes in the same fictions, we all obey the same laws and can thereby cooperate effectively.


Centuries ago, millions of Christians locked themselves inside a self-reinforcing mythological bubble, never daring to question the factual veracity of the Bible, while millions of Muslims put their unquestioning faith in the Quran. We have zero scientific evidence that Eve was tempted by the serpent, that the souls of all infidels burn in hell after they die, or that the creator of the universe doesn’t like it when a Brahmin marries a Dalit — yet billions of people have believed in these stories for thousands of years.

Some fake news lasts forever.

I am aware that many people might be upset by my equating religion with fake news, but that’s exactly the point. When a thousand people believe some made-up story for one month, that’s fake news. When a billion people believe it for a thousand years, that’s a religion, and we are admonished not to call it “fake news” in order not to hurt the feelings of the faithful (or incur their wrath).

Please note that I am not denying the effectiveness or potential benevolence of religion — just the opposite. For better or worse, fiction is among the most effective tools in humanity’s tool kit. By bringing people together, religious creeds make large-scale human cooperation possible. They inspire people to build hospitals, schools and bridges in addition to armies and prisons. Much of the Bible may be fictional, but it can still bring joy to billions and can still encourage humans to be compassionate, courageous, and creative— just like other great works of fiction, such as Don Quixote, War and Peace and the Harry Potter books.

Again, some people might be offended by my comparison of the Bible to Harry Potter. If you are a scientifically minded Christian, you might argue that the holy book was never meant to be read as a factual account, but rather as a metaphorical story containing deep wisdom. But isn’t that true of the Harry Potter stories too?


Of course, not all religious myths have been beneficent. On August 29, 1255, the body of a nine-year-old English boy called Hugh was found in a well in the town of Lincoln. Rumor quickly spread that Hugh had been ritually murdered by the local Jews. The story only grew with retelling, and one of the most renowned English chroniclers of the day, Matthew Paris, provided a detailed and gory description of how prominent Jews from throughout England gathered in Lincoln to fatten up, torture, and finally crucify the abandoned child. Nineteen Jews were tried and executed for the alleged murder. Similar blood libels became popular in other English towns, leading to a series of pogroms in which whole Jewish communities were massacred. Eventually, in 1290, the entire Jewish population of England was expelled.

The story doesn’t end there. A century after the expulsion of the Jews, Geoffrey Chaucer included a blood libel modeled on the story of Hugh of Lincoln in the Canterbury Tales (“The Prioress’s Tale”). The tale culminates with the hanging of the Jews. Similar blood libels subsequently became a staple of every anti-Semitic movement from late medieval Spain to modern Russia.

Hugh of Lincoln was buried in Lincoln Cathedral and venerated as a saint. He was reputed to perform various miracles, and his tomb continued to draw pilgrims even centuries after the expulsion of all Jews from England. Only in 1955 — ten years after the Holocaust — did Lincoln Cathedral repudiate the blood libel story, placing a plaque near Hugh’s tomb that reads:

“Trumped-up stories of ‘ritual murders’ of Christian boys by Jewish communities were common throughout Europe during the Middle Ages and even much later. These fictions cost many innocent Jews their lives. Lincoln had its own legend and the alleged victim was buried in the Cathedral in the year 1255. Such stories do not redound to the credit of Christendom.”

Well, some fake news only lasts seven hundred years.

Ancient religions have not been the only ones to use fiction to cement cooperation. In more recent times, each nation has created its own national mythology, while movements such as communism, fascism and liberalism fashioned elaborate self-reinforcing credos. Joseph Goebbels, the Nazi propaganda maestro, allegedly explained his method thus: “A lie told once remains a lie, but a lie told a thousand times becomes the truth.” In Mein Kampf, Hitler wrote, “The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly — it must confine itself to a few points and repeat them over and over.” Can any present-day fake-news peddler improve on that?


Commercial firms also rely on fiction and fake news. Branding often involves retelling the same fictional story again and again, until people become convinced it is the truth. What images come to mind when you think about Coca-Cola? Do you think about healthy young people engaging in sports and having fun together? Or do you think about overweight diabetes patients lying in a hospital bed? Drinking lots of Coca-Cola will not make you young, will not make you healthy, and will not make you athletic — rather, it will increase your chances of suffering from obesity and diabetes. Yet for decades Coca-Cola has invested billions of dollars in linking itself to youth, health, and sports — and billions of humans subconsciously believe in this linkage.

The truth is, truth has never been high on the agenda of Homo sapiens. If you stick to unalloyed reality, few people will follow you. False stories have an intrinsic advantage over the truth when it comes to uniting people. If you want to gauge group loyalty, requiring people to believe an absurdity is a far better test than asking them to believe the truth. If the chief says the sun rises in the west and sets in the east, only true loyalists will clap their hands. Similarly, if all your neighbors believe the same outrageous tale, you can count on them to stand together in times of crisis. If they are willing to believe only accredited facts, what does that prove?

You might argue that in some cases it is possible to organize people effectively through consensual agreement rather than through fictions. In the economic sphere, money and corporations bind people together far more effectively than any god or holy book, even though they are just a human convention. In the case of a holy book, a true believer would say, “I believe that the book is sacred,” while in the case of the dollar, a true believer would say only, “I believe that other people believe that the dollar is valuable.” It is obvious that the dollar is just a human creation, yet people all over the world respect it. If so, why can’t humans abandon all myths and fictions and organize themselves on the basis of consensual conventions such as the dollar?

Yet the difference between holy books and money is far smaller than it might seem. When most people see a dollar bill, they forget that it is just a human convention. As they see the green piece of paper with the picture of the dead white man, they see it as something valuable in and of itself. They hardly ever remind themselves, “Actually, this is a worthless piece of paper, but because other people view it as valuable, I can make use of it.” If you observed a human brain in an fMRI scanner, you would see that as someone is presented with a suitcase full of hundred-dollar bills, the parts of the brain that start buzzing with excitement are not the skeptical parts but the greedy parts. Conversely, in the vast majority of cases people begin to sanctify the Bible or the Vedas only after long and repeated exposure to others who view it as sacred. We learn to respect holy books in exactly the same way we learn to respect paper currency.


For this reason there is no strict division in practice between knowing that something is just a human convention and believing that something is inherently valuable. In many cases, people are ambiguous or forgetful about this division. To give another example, in a deep philosophical discussion about it, almost everybody would agree that corporations are fictional stories created by human beings. Microsoft isn’t the buildings it owns, the people it employs, or the shareholders it serves — rather, it is an intricate legal fiction woven by lawmakers and lawyers. Yet 99 percent of the time, we aren’t engaged in deep philosophical discussions, and we treat corporations as if they are real entities, just like tigers or humans.

Blurring the line between fiction and reality can be done for many purposes, starting with “having fun” and going all the way to “survival.” You cannot play games or read novels unless you suspend disbelief. To really enjoy soccer, you have to accept the rules and forget for at least ninety minutes that they are merely human inventions. If you don’t, you will think it utterly ridiculous for 22 people to go running after a ball. Soccer might begin with just having fun, but it can become far more serious stuff, as any English hooligan or Argentinian nationalist will attest. Soccer can help formulate personal identities, it can cement large-scale communities, and it can even provide reasons for violence.

Humans have a remarkable ability to know and not know at the same time. Or, more correctly, they can know something when they really think about it, but most of the time they don’t think about it, so they don’t know it. If you really focus, you realize that money is fiction. But you usually don’t think about it. If you are asked about it, you know that soccer is a human invention. But in the heat of a match, nobody asks. If you devote the time and energy, you can discover that nations are elaborate yarns. But in the midst of a war, you don’t have the time and energy.


Truth and power can travel together only so far. Sooner or later they go their separate paths. If you want power, at some point you will have to spread fictions. If you want to know the truth about the world, at some point you will have to renounce power. You will have to admit things — for example, about the sources of your own power — that will anger allies, dishearten followers, or undermine social harmony.

Scholars throughout history have faced this dilemma: Do they serve power or truth? Should they aim to unite people by making sure everyone believes in the same story, or should they let people know the truth even at the price of disunity? The most powerful scholarly establishments — whether of Christian priests, Confucian mandarins or Communist ideologues — placed unity above truth. That’s why they were so powerful.

As a species, humans prefer power to truth. We spend far more time and effort on trying to control the world than on trying to understand it — and even when we try to understand it, we usually do so in the hope that understanding the world will make it easier to control it. If you dream of a society in which truth reigns supreme and myths are ignored, you have little to expect from Homo sapiens. Better to try your luck with chimps.

Source : http://bit.ly/2SvgXWv

A Brave and Startling Truth: Astrophysicist Janna Levin Reads Maya Angelou’s Stunning Humanist Poem That Flew to Space, Inspired by Carl Sagan.

The second annual Universe in Verse — a celebration of science through poetry, and a voice of resistance against the assault on nature — opened with the poem “A Brave and Startling Truth” by Maya Angelou (April 4, 1928–May 28, 2014), which flew to space on the Orion spacecraft. I chose this poem to set the tone for the show in part because it is absolutely stunning and acutely relevant to our cultural moment, and in part because the first time I read it, it sparked in me a sudden insight into the often invisible ways in which science and poetry influence and inspire one another — into how the golden threads of thought and feeling stretch and cross-hatch across disciplines to weave what we call culture.

Angelou composed the poem for the 50th anniversary of the United Nations in 1995. In 1994, Carl Sagan delivered a beautiful speech at Cornell University, inspired by the Voyager’s landmark photograph of Earth seen for the very first time from the outer reaches of the Solar System — a now-iconic image the spacecraft took on Sagan’s spontaneous insistence before shutting off the cameras upon completion of the planned mission to photograph the outer planets.

In describing what the Voyager captured in that grainy photograph of mostly empty space, Sagan limned Earth as a “pale blue dot.” That became the moniker of the photograph itself and the title of his bestselling book published later that year, in which he wrote that “everyone you love, everyone you know, everyone you ever heard of, every human being who ever was, lived out their lives” on this “mote of dust suspended in a sunbeam.”

This poetic phrase imprinted itself on the popular imagination and permeated culture in the months following the book’s publication — the months during which Angelou was composing her poem. Like all great poets, she was extremely precise and deliberate about her word choice. Mote is a rather peculiar word, particularly in this cosmic context, and I can’t help but think that by using the phrase “mote of matter” in the final stanzas, Angelou was paying tribute to Sagan and to the message of the Voyager — a message about our place in the cosmic order not as something separate from and superior to nature, but as a tiny pixel-part of it, imbued with equal parts humility and responsibility.

Reading the poem at The Universe in Verse is astrophysicist Janna Levin — who has recently read some beautiful poetry herself, and who belongs, alongside Sagan, to the tiny peer group of working scientists who write about science with uncommon poetic might. Please enjoy:

A BRAVE AND STARTLING TRUTH

We, this people, on a small and lonely planet
Traveling through casual space
Past aloof stars, across the way of indifferent suns
To a destination where all signs tell us
It is possible and imperative that we learn
A brave and startling truth

And when we come to it
To the day of peacemaking
When we release our fingers
From fists of hostility
And allow the pure air to cool our palms

When we come to it
When the curtain falls on the minstrel show of hate
And faces sooted with scorn are scrubbed clean
When battlefields and coliseum
No longer rake our unique and particular sons and daughters
Up with the bruised and bloody grass
To lie in identical plots in foreign soil

When the rapacious storming of the churches
The screaming racket in the temples have ceased
When the pennants are waving gaily
When the banners of the world tremble
Stoutly in the good, clean breeze

When we come to it
When we let the rifles fall from our shoulders
And children dress their dolls in flags of truce
When land mines of death have been removed
And the aged can walk into evenings of peace
When religious ritual is not perfumed
By the incense of burning flesh
And childhood dreams are not kicked awake
By nightmares of abuse

When we come to it
Then we will confess that not the Pyramids
With their stones set in mysterious perfection
Nor the Gardens of Babylon
Hanging as eternal beauty
In our collective memory
Not the Grand Canyon
Kindled into delicious color
By Western sunsets

Nor the Danube, flowing its blue soul into Europe
Not the sacred peak of Mount Fuji
Stretching to the Rising Sun
Neither Father Amazon nor Mother Mississippi who, without favor,
Nurture all creatures in the depths and on the shores
These are not the only wonders of the world

When we come to it
We, this people, on this minuscule and kithless globe
Who reach daily for the bomb, the blade and the dagger
Yet who petition in the dark for tokens of peace
We, this people on this mote of matter
In whose mouths abide cankerous words
Which challenge our very existence
Yet out of those same mouths
Come songs of such exquisite sweetness
That the heart falters in its labor
And the body is quieted into awe

We, this people, on this small and drifting planet
Whose hands can strike with such abandon
That in a twinkling, life is sapped from the living
Yet those same hands can touch with such healing, irresistible tenderness
That the haughty neck is happy to bow
And the proud back is glad to bend
Out of such chaos, of such contradiction
We learn that we are neither devils nor divines

When we come to it
We, this people, on this wayward, floating body
Created on this earth, of this earth
Have the power to fashion for this earth
A climate where every man and every woman
Can live freely without sanctimonious piety
Without crippling fear

When we come to it
We must confess that we are the possible
We are the miraculous, the true wonder of this world
That is when, and only when
We come to it.

“A Brave and Startling Truth” was published in a commemorative booklet in 1995 and was later included in Maya Angelou: The Complete Poetry (public library).

More highlights from the second annual Universe in Verse will be released here over the coming weeks and months. For some high points of the inaugural event, see Levin’s exquisite reading of Adrienne Rich’s tribute to women in astronomy and U.S. Poet Laureate Tracy K. Smith’s ode to the Hubble Space Telescope, then savor the complete show for a two-hour poetic serenade to science.

Source: http://bit.ly/2UaFkpx

What Is the “True” Human Diet?

People have been debating the natural human diet for thousands of years, often framed as a question of the morality of eating other animals. The lion has no choice, but we do. Take the ancient Greek philosopher Pythagoras, for example: “Oh, how wrong it is for flesh to be made from flesh!” The argument hasn’t changed much for ethical vegetarians in 2,500 years, but today we also have Sarah Palin, who wrote in Going Rogue: An American Life, “If God had not intended for us to eat animals, how come He made them out of meat?” Have a look at Genesis 9:3—“Every moving thing that liveth shall be meat for you.”

Humans don’t have the teeth or claws of a mammal evolved to kill and eat other animals, but that doesn’t mean we aren’t “supposed” to eat meat. Our early Homo ancestors invented weapons and cutting tools in lieu of sharp carnivorelike teeth. There is no explanation other than meat eating for the fossil animal bones riddled with stone tool cut marks at fossil sites. Meat eating also explains our simple guts, which look little like those evolved to process large quantities of fibrous plant foods.

But gluten isn’t unnatural either. Despite the pervasive call to cut carbs, there is plenty of evidence that cereal grains were staples, at least for some, long before domestication. People at Ohalo II on the shore of the Sea of Galilee ate wheat and barley during the peak of the last ice age, more than 10,000 years before these grains were domesticated. Paleobotanists have even found starch granules trapped in the tartar on 40,000-year-old Neandertal teeth with the distinctive shapes of barley and other grains and the telltale damage that comes from cooking. There is nothing new about cereal consumption.

This leads us to the so-called Paleolithic Diet. As a paleoanthropologist I’m often asked for my thoughts about it. I’m not really a fan—I like pizza and French fries and ice cream too much. Nevertheless, diet gurus have built a strong case for discordance between what we eat today and what our ancestors evolved to eat. The idea is that our diets have changed too quickly for our genes to keep up, and the result is said to be “metabolic syndrome,” a cluster of conditions that include elevated blood pressure, high blood sugar level, obesity and abnormal cholesterol levels. It’s a compelling argument. Think about what might happen if you put diesel in an automobile built for regular gasoline. The wrong fuel can wreak havoc on the system, whether you’re filling a car or stuffing your face.

It makes sense, and it’s no surprise that Paleolithic diets remain hugely popular. There are many variants on the general theme, but foods rich in protein and omega-3 fatty acids show up again and again. Grass-fed cow meat and fish are good, and carbohydrates should come from nonstarchy fresh fruits and vegetables. On the other hand, cereal grains, legumes, dairy, potatoes, and highly refined and processed foods are out. The idea is to eat like our Stone Age ancestors—you know, spinach salads with avocado, walnuts, diced turkey, and the like.

I am not a dietician and cannot speak with authority about the nutritional costs and benefits of Paleolithic diets, but I can comment on their evolutionary underpinnings. From the standpoint of paleoecology, the Paleolithic diet is a myth. Food choice is as much about what is available to be eaten as it is about what a species evolved to eat. And just as fruits ripen, leaves flush and flowers bloom predictably at different times of the year, foods available to our ancestors varied over deep time as the world changed around them from warm and wet to cool and dry and back again. Those changes are what drove our evolution.

Even if we could reconstruct the precise nutrient composition of foods eaten by a particular hominin species in the past (and we can’t), the information would be meaningless for planning a menu based on our ancestral diet. Because our world was ever changing, so, too, was the diet of our ancestors. Focusing on a single point in our evolution would be futile. We’re a work in progress. Hominins were spread over space, too, and those living in the forest by the river surely had a different diet from their cousins on the lakeshore or the open savanna.

What was the ancestral human diet? The question itself makes no sense. Consider some of the recent hunter-gatherers who have inspired Paleolithic diet enthusiasts. The Tikiġaġmiut of the north Alaskan coast lived almost entirely on the protein and fat of marine mammals and fish, whereas the Gwi San in Botswana’s Central Kalahari took something like 70 percent of their calories from carbohydrate-rich, sugary melons and starchy roots. Traditional human foragers managed to earn a living from the larger community of life that surrounded them in a remarkable variety of habitats, from near-polar latitudes to the tropics. Few other mammalian species can make that claim, and there is little doubt that dietary versatility has been key to the success we’ve had.

Many paleoanthropologists today believe that increasing climate fluctuation through the Pleistocene sculpted our ancestors—whether their bodies or their wit, or both—for the dietary flexibility that has become a hallmark of humanity. The basic idea is that our ever changing world winnowed out the pickier eaters among us. Nature has made us a versatile species, which is why we can find something to satiate us on nearly all its myriad biospheric buffet tables. It’s also why we have been able to change the game, transition from forager to farmer, and really begin to consume our planet.

Source: https://bit.ly/2oDhXGB