If you try to play in this style, you’ll need patience and a willingness to experiment, sometimes with little hope of achieving much in the short run. There are no easy shortcuts. Existing banjo arrangements almost never translate onto the guitar in any straightforward way. It has taken me years to make the style work on the guitar, and I am still learning. I do it because I fell in love with Old Time fiddle music; I don’t fiddle, and have no hope of learning to play like Bruce Molsky. Again, the point is that Old Time fiddle music, like Celtic music, has a life of its own, outside of the guitar; the music should shape the guitar playing as far as is possible (given one’s own limitations and those of the instrument), not vice versa.
Thomas believes that this naturalist tradition is also why Europe is acting much faster than other places — for example, the United States — to address the decline of insects: Interest leads to tracking, which leads to awareness, which leads to concern, which leads to action. Since the Krefeld data emerged, there have been hearings about protecting insect biodiversity in the German Bundestag and the European Parliament. European Union member states voted to extend a ban on neonicotinoid pesticides and have begun to put money toward further studies of how abundance is changing, what is causing those changes and what can be done. When I knocked on the door of de Kroon’s office, at Radboud University in the Dutch city Nijmegen, he was looking at some photos from another meeting he had that day: Willem-Alexander, the king of the Netherlands, had taken a tour of the city’s efforts to make its riverside a friendlier habitat for bugs.
Stemming insect declines will require much more than this, however. The European Union already had some measures in place to help pollinators — including more strictly regulating pesticides than the United States does and paying farmers to create insect habitats by leaving fields fallow and allowing for wild edges alongside cultivation — but insect populations dropped anyway. New reports call for national governments to collaborate; for more creative approaches such as integrating insect habitats into the design of roads, power lines, railroads and other infrastructure; and, as always, for more studies. The necessary changes, like the causes, may be profound. “It’s just another indication that we’re destroying the life-support system of the planet,” Lister says of the Puerto Rico study. “Nature’s resilient, but we’re pushing her to such extremes that eventually it will cause a collapse of the system.”
In 1846 Melbourne was gripped by a panic: a story had spread that a white woman had been shipwrecked off the coast of Gippsland and was living with Aboriginal people. “Expeditions” were sent to “rescue” her. Messages were left for her printed on handkerchiefs, and because some believed she was Scottish, some of these were written in Gaelic.
The expeditions sent to Gippsland resulted in the massacre of large numbers of Indigenous people from the Gunai/Kurnai community.
For generations, people have argued over whether the “white woman” really existed and, if so, what happened to her. In her 2001 book The Captive White Woman of Gipps Land, author Julie Carr recounted a story written in 1897 by Mary Howitt, the daughter of A.W. Howitt, an anthropologist and Gippsland magistrate, which told how the white woman later had children with an Aboriginal husband and drowned in McLennan’s Strait. Carr concluded that the evidence for the woman’s existence was inconclusive, government searches in 1846 and 1847 having failed to find her.
But we have recently identified two short songs in the Aboriginal language of Gippsland (Gunai/Kurnai) about the white woman’s story that provide some clues. These were in the papers of Howitt at the State Library of Victoria.
Justin Wren returns to the mixed martial arts ring this weekend after a five-year hiatus during which he lived among the Pygmy people in Africa, helping them secure their own land and launch sustainable farming initiatives. He wrote this essay on his time in Africa and his reasons for getting back in the ring.
“Can you help give us a voice? We have none.”
The Pygmy chief looked up at me, trying to hide the urgency in his eyes.
Only two days earlier, I’d held the lifeless body of a Pygmy baby who had succumbed to water-borne disease and malnutrition. His name was Andibo. He’d been the last living relative in his family besides his mother. Now she was alone, sitting in the corner of the hut, listless and barely able to shed a tear from sheer starvation. I stayed and dug the grave for little Andibo until my hands blistered and bled – it was the least I could do for these destitute people, who were starving and suffering greatly.
I was on the tail end of my second trip to the Democratic Republic of the Congo, which is among the world’s richest countries in natural resources but still one of the poorest nations in the world. The Pygmies are on the lowest rung of society; they aren’t afforded the rights of citizens and have been enslaved by neighboring non-Pygmy tribes to work in their fields for generations upon generations. Pygmies are paid for their labor in clothes or with scraps of food, like two small bananas or a minnow or two for an entire family. One woman I met was paid with a small patch of goatskin hide, not to wear, but to eat!
To oversimplify: fast strategies (think “live fast, die young”) are well-adapted for unpredictable dangerous environments. Each organism has a pretty good chance of randomly dying in some unavoidable way before adulthood; the species survives by sheer numbers. Fast organisms should grow up as quickly as possible in order to maximize the chance of reaching reproductive age before they unpredictably die. They should mate with anybody around, to maximize the chance of mating before they unpredictably die. They should ignore their offspring, since they expect most offspring to unpredictably die, and since they have too many to take care of anyway. They should be willing to take risks, since the downside (death without reproducing) is already their default expectation, and the upside (becoming one of the few individuals to give birth to the 10,000 offspring of the next generation) is high.
Slow strategies are well-adapted for safer environments, or predictable complex environments whose intricacies can be mastered with enough time and effort. Slow strategy animals may take a long time to grow up, since they need to achieve mastery before leaving their parents. They might be very picky maters, since they have all the time in the world to choose, will only have a few children each, and need to make sure each of those children has the best genes possible. They should work hard to raise their offspring, since each individual child represents a substantial part of the prospects of their genetic line. They should avoid risks, since the downside (death without reproducing) would be catastrophically worse than default, and the upside (giving birth to a few offspring of the next generation) is what they should expect anyway.
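The trade-off described above can be made concrete with a toy expected-value calculation. This is a minimal sketch, not from the source: the function and all parameter values below are made-up illustrations, chosen only to show how the same arithmetic favors the fast strategy when random mortality is high and the slow strategy when it is low.

```python
# Toy model: expected surviving offspring for one individual equals
# P(surviving to reproductive age) * number of offspring * P(each
# offspring surviving, which parental care can raise).

def expected_success(maturation_time, n_offspring, offspring_survival, mortality):
    """Expected number of surviving offspring.

    mortality: per-time-step chance of random, unavoidable death.
    Surviving to maturity requires dodging that risk every step.
    """
    survive_to_adult = (1 - mortality) ** maturation_time
    return survive_to_adult * n_offspring * offspring_survival

# Hypothetical strategy parameters (illustrative only):
# fast: matures in 1 step, many offspring, no parental care.
# slow: matures in 10 steps, few offspring, heavy parental care.
fast = dict(maturation_time=1, n_offspring=100, offspring_survival=0.01)
slow = dict(maturation_time=10, n_offspring=3, offspring_survival=0.8)

for mortality in (0.05, 0.40):  # safe vs dangerous environment
    f = expected_success(**fast, mortality=mortality)
    s = expected_success(**slow, mortality=mortality)
    better = "fast" if f > s else "slow"
    print(f"mortality={mortality:.2f}: fast={f:.2f}, slow={s:.2f} -> {better}")
```

With these numbers, the slow strategy wins in the safe environment, while at 40% per-step mortality the slow strategist almost never lives long enough to reproduce and the fast strategy dominates, mirroring the verbal argument above.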
Del Giudice asks: what if life history strategies differ not just across species, but across individuals of the same species? What if this theory applied within the human population?
In the past year, at least four fossil finds have been billed as overturning the story of human evolution. The 300,000-year-old Homo sapiens specimen from Jebel Irhoud, Morocco, is hailed as pushing back the age of our species by nearly 100,000 years. Similarly, discoveries in Israel (a fossilized jawbone), Saudi Arabia (a fossilized finger bone), and Siberia (several bone fragments) were each declared to be the oldest H. sapiens in their respective regions of the world. With each new find, researchers and news headlines announced that the fossils significantly altered our understanding of human evolution and dispersal from Africa. But if we have to rewrite the story of H. sapiens evolution so frequently, we might ask whether the plot we’re using is wrong to begin with.
The dominant, almost axiomatic paleoanthropological narrative holds that anatomically modern humans evolved in sub-Saharan Africa 200,000 years ago, and considers any fossil find relative to that framework. So, the Jebel Irhoud cranium from Morocco was reported as pushing back the origin of our species and making it a pan-African phenomenon (Hublin et al. 2017); the Israeli and Saudi specimens were said to push back the timing of the dispersal out of Africa (Hershkovitz et al. 2018; Groucutt et al. 2018); and the Siberian fossils were called the oldest modern humans outside of the Middle East and Africa (Siberian Times 2018), despite older findings (and similar headlines) from China, Laos, and Indonesia in the last 10 years. In essence, the data are subservient to the narrative that an entity known as anatomically modern humans exists and has a singular origin. Yet, this story ignores the complex fossil records of Asia and Australia and perpetuates a distinctly Eurocentric vision of our past.
The phrase “anatomically modern Homo sapiens” was first used in the 1970s to distinguish between Neanderthals and the European hominins who looked more like us. It wasn’t meant to establish a formal species boundary. But, in 1987, when Cann, Stoneking, and Wilson published their mitochondrial DNA study tracing all living humans back to a single ancestral population that lived in Africa around 200,000 years ago, molecular anthropology took on a new significance in the story of human evolution. After that, all non-African Middle and Late Pleistocene populations—including European Neanderthals as well as Homo erectus in Asia—were considered evolutionary dead ends. Despite the lack of consensus on this model among fossil experts and population geneticists, it has become the prevailing wisdom in a generation of anthropology textbooks and introductory lectures.
The Phoenician culture was one of the most influential and widespread in the history of the Mediterranean basin. From its rise in the northern Levant, the Phoenicians connected east and west for over a millennium through their established trade networks across the Mediterranean, reaching beyond the Straits of Gibraltar. One of their first western outposts was the city of Gadir, modern Cadiz, on the Atlantic coast of Spain1, believed to have been initially settled around 1100 BCE and to have become a fully-fledged Phoenician settlement by the end of the 9th century BCE2. Additional Phoenician settlements were established on the Balearic Islands, Sardinia, Sicily, Malta, and Cyprus2. These islands form a strategic arc across the northern Mediterranean, allowing for island hopping from the Levantine homeland to the Iberian Peninsula and the North African coast, where the Phoenicians established their most dominant Western Mediterranean settlement at Carthage. The early Phoenician settlements in the Western Mediterranean are generally referred to as Western Phoenician. From the middle of the 6th century BCE onwards, with the shift in Phoenician influence from the Levant to Carthage, the term Punic became the usual designation, synonymous with Phoenicians outside the Levant. Our previous research has shown that the Phoenician settlers integrated with the indigenous communities already living on the islands they settled to form the new Phoenician societies3.
Archaeological evidence on the largest of the Balearic Islands, Mallorca and Menorca, indicates continuous settlement since the 3rd millennium BCE. Limited evidence favours some level of continuity of prehistoric settlement in Ibiza from the end of the Bronze Age until the arrival of the Phoenicians in the 7th century BCE4. These Bronze Age settlements, however, had relatively small populations compared to those of the larger Balearic Islands5,6,7. Ancient DNA analyses of human remains from early Ibizan sites can therefore provide evidence as to the origins of the Phoenician settlers of the island and the relationship between these early settlers and the modern population of the island. Further, when combined with other genetic data from Phoenician populations, such analyses can provide key information about the process of Phoenician expansion and settlement of the Western Mediterranean.
Human beings have the capacity to directly perceive single photons of light, as was recently established experimentally and published in the journal Nature. This discovery is connected to what a number of leading physicists believe is another, even more astounding impending one, likely to be published in the next several months: the human capacity to directly perceive radical aspects of the quantum nature of light, especially superposition and quantum entanglement/non-locality.
Moreover, according to some of these leading physicists, some of the most important next steps in the progression of quantum physics and cosmology may actually depend on what trained human observers directly perceive in terms of the quantum properties of individual photons, especially regarding superposition and quantum entanglement.
Over a decade ago, in Bushell’s own research into the sensory-perceptual abilities of highly advanced, long-term, adept practitioners of special forms of observational meditation, he began to realize that some of these practitioners were actually specifically and explicitly attempting to study light with their own highly trained visual capacities, including attempting to perceive the most elementary, fundamental “partless particles” of light. In fact, they were in many ways following the same protocols that contemporary biophysicists and vision scientists employ for testing the human capacity for detecting the least amount of light. The basic protocol includes the following key factors: the need for a completely dark, virtually light-proof chamber, which produces in human vision what is called the dark-adapted scotopic condition; the need for relatively complete motionlessness, as movements can distract and distort perception; the need for extended periods of highly directed and sustained attention; the need for being able to engage in multiple trials of viewing light, i.e., training and learning of the task; the ability to discriminate between actual external sources of light and light spontaneously produced by the body, especially by the visual system itself (internally produced light phenomena known as phosphenes or biophotons).
Researchers found the oath among 130 ancient documents that were recently donated by the Kizu family, which was once a ninja clan in the town of Iga, near Kyoto. The oath likely was sent to Kizu’s relatives after his death, Yoshiki Takao, an associate professor with the International Ninja Research Center at Mie University in Japan, told AFP.
In the vow, Kizu pledged his thanks to the mentor who trained him in “ninjutsu,” or “the way of the ninja.” The term does not indicate the practice of a specific martial art, according to the Ninja Museum of Igaryu in Iga, Japan. Rather, ninjutsu is a type of “warfare art” specializing in espionage and strategic military attacks.
Popular representations of ninjas in movies, TV and comics emphasize swordplay as well as stealth. However, in reality, ninjas who relied on weapons were seen as “foolish” compared to ninjas who could gather enemy intelligence without being detected, according to the museum.
Kizu affirmed in the oath that he would never use his ninja skills for stealing — unless he was instructed to do so — and that he would never share what he learned, even with his closest relatives. Any new combat skills or weapons that he acquired, if they were not already known to other ninjas, would be duly reported to his superiors, according to AFP.
Though it is generally agreed upon that the Greeks borrowed (and modified) the alphabet from the Phoenicians, there is no consensus about the moment when this took place. Over the years, several dates have been proposed, ranging from the 14th to the 8th/7th century BC. In classical studies the prevalent opinion is that the alphabet was introduced in or shortly before the 8th century BC, when the first attestations of Greek alphabetic writing appear. There are, however, quite a number of indications (from existing and new evidence) that plead for a much earlier date. In this article, a detailed analysis of the presently available archaeological, epigraphic and linguistic data will be presented to argue the case for an introduction in the 11th century BC at the latest.
“Sumerian is probably the last member of what must have been a large family of languages that goes back thousands and thousands of years,” says Irving Finkel, the curator in charge of the 130,000 cuneiform tablets stored at the British Museum. “Writing appeared in the world just in time to rescue Sumerian… We’re just lucky that we had some ‘microphone’ that picked it up before it went away with all the others.”
Finkel is one of the world’s leading cuneiform experts. In his book-filled office at the British Museum, he explains how the script was slowly deciphered thanks to a multi-lingual inscription about a king, just like the Rosetta Stone that helped researchers make sense of Egyptian hieroglyphs.
Like a brain, an ant colony operates without central control. Each is a set of interacting individuals, either neurons or ants, using simple chemical interactions that in the aggregate generate their behaviour. People use their brains to remember. Can ant colonies do that? This question leads to another question: what is memory? For people, memory is the capacity to recall something that happened in the past. We also ask computers to reproduce past actions – the blending of the idea of the computer as brain and brain as computer has led us to take ‘memory’ to mean something like the information stored on a hard drive. We know that our memory relies on changes in how strongly a set of linked neurons stimulate each other; that it is reinforced somehow during sleep; and that recent and long-term memory involve different circuits of connected neurons. But there is much we still don’t know about how those neural events come together, whether there are stored representations that we use to talk about something that happened in the past, or how we can keep performing a previously learned task such as reading or riding a bicycle.
5G will substantially increase exposure to radiofrequency electromagnetic fields (RF-EMF), on top of the 2G, 3G, 4G, and Wi-Fi telecommunications networks already in place.
5G leads to a massive increase in mandatory exposure to wireless radiation
Insects Are Vanishing
When most of us think of animals that should be saved from annihilation, near the top of any list are likely to be the stars of the animal world: tigers and polar bears, orcas and orangutans, elephants and rhinos, and other similarly charismatic creatures.
Few express similar concern or are likely to be willing to offer financial support to “save” insects. The few that are in our visible space and cause us nuisance, we regularly swat, squash, crush, or take out en masse with Roundup.
As it happens, though, of the nearly two million known species on this planet, about 70% are insects. And many of them are as foundational to the food chain for land animals as plankton are for marine life. Harvard entomologist (and ant specialist) E.O. Wilson once observed that “if insects were to vanish, the environment would collapse into chaos.”
In fact, insects are vanishing.
Almost exactly a year ago, the first long-term study of the decline of insect populations was reported, sparking concern (though only in professional circles) about a possible “ecological Armageddon.” Based on data collected by dozens of amateur entomologists in 63 nature reserves across Germany, a team of scientists concluded that the flying insect population had dropped by a staggering 76% over a 27-year period. At the same time, other studies began to highlight dramatic plunges across Europe in the populations of individual species of bugs, bees, and moths.
What could be contributing to such a collapse? It certainly is human-caused, but the factors involved are many and hard to sort out, including habitat degradation and loss, the use of pesticides in farming, industrial agriculture, pollution, climate change, and even, insidiously enough, “light pollution that leads nocturnal insects astray and interrupts their mating.”
This past October, yet more troubling news arrived.
Here we are, once again, at the end of a calendar year filled with lots of exciting news in the field of human evolution. Last year, just as we were finalizing edits on the 2017 Top 5 Human Evolution Discoveries list, the remainder of the skeleton of a human ancestor known colloquially as “Little Foot” (belonging to the genus Australopithecus, the same genus, but different species, as the famed “Lucy” fossil) was finally revealed after 20 years of cleaning and excavation from its embedding rock. Amazingly, just as we are finishing the edits for this year’s installment of top human evolution discoveries, Little Foot is back in the news. As of the last week of November, full descriptions and analyses of the remainder of the fossils are now available (prior to undergoing peer review) on the preprint server bioRxiv. Enjoy reading our Top 6 list for 2018! Why 6? These stories are too cool not to share.
Hello everybody!
I am studying possibilities for improving the performance of this website.
Perhaps Dreamhost will help me. We shall see.
Depending upon results, this may or may not be the final blog post.
I have several other things going on that demand my time and attention. You know, I’m closer to the end of my lifetime than the beginning, so there are priorities to consider.
I’m hoping to raise some money to get one of these, so I can get around on this mountain more easily.
Whatever happens, please enjoy a very Happy Christmas/Winter Solstice and I wish you all a safe, prosperous and interesting New Year.
Blessings to all my readers and supporters.