Breakfast, lunch and dinner: Have we always eaten them?


British people – and many others across the world – have been brought up on the idea of three square meals a day as a normal eating pattern, but it wasn’t always that way.

People are repeatedly told the hallowed family dinner around a table is in decline and the UK is not the only country experiencing such change.

The case for breakfast, a meal skipped by many to their supposed detriment, is that it makes us more alert, helps keep us trim and improves children’s work and behaviour at school.

But when people worry that breaking with the traditional three meals a day is harmful, are they right about the traditional part? Have people always eaten in that pattern?

Breakfast as we know it didn’t exist for large parts of history. The Romans didn’t really eat it, usually consuming only one meal a day around noon, says food historian Caroline Yeldham. In fact, breakfast was actively frowned upon.

“The Romans believed it was healthier to eat only one meal a day,” she says. “They were obsessed with digestion and eating more than one meal was considered a form of gluttony. This thinking impacted on the way people ate for a very long time.”

In the Middle Ages monastic life largely shaped when people ate, says food historian Ivan Day. Nothing could be eaten before morning Mass and meat could only be eaten for half the days of the year. It’s thought the word breakfast entered the English language during this time and literally meant “break the night’s fast”.

Religious ritual also gave us the full English breakfast. On Collop Monday, the day before Shrove Tuesday, people had to use up meat before the start of Lent. Much of that meat was pork and bacon as pigs were kept by many people. The meat was often eaten with eggs, which also had to be used up, and the precursor of the full English breakfast was born.

But at the time it probably wasn’t eaten in the morning.

It is believed that all social classes started eating breakfast in about the 17th Century, according to chef Clarissa Dickson Wright. After the restoration of Charles II, coffee, tea and dishes like scrambled eggs started to appear on the tables of the wealthy. By the late 1740s, breakfast rooms also started appearing in the homes of the rich.

This morning meal reached new levels of decadence in aristocratic circles in the 19th Century, with the fashion for hunting parties that lasted days, even weeks. Up to 24 dishes would be served for breakfast.

The Industrial Revolution in the mid-19th Century regularised working hours, with labourers needing an early meal to sustain them at work. All classes started to eat a meal before going to work, even the bosses.

At the turn of the 20th Century, breakfast was revolutionised once again by American John Harvey Kellogg. He accidentally left some boiled maize out and it went stale. He passed it through some rollers and baked it, creating the world’s first cornflake. He sparked a multi-billion pound industry.

By the 1920s and 1930s the government was promoting breakfast as the most important meal of the day, but then World War II made the usual breakfast fare hard to get. As Britain emerged from the post-war years into the economically liberated 1950s, things like American toasters, sliced bread, instant coffee and pre-sugared cereals invaded the home. Breakfast as we now know it had arrived.

 

The terminology around eating in the UK is still confusing. For some “lunch” is “dinner” and vice versa. From Roman times to the Middle Ages everyone ate in the middle of the day, but it was called dinner and was the main meal of the day. Lunch as we know it didn’t exist – not even the word.

During the Middle Ages daylight shaped mealtimes, says Day. With no electricity, people got up earlier to make use of daylight. Workers had often toiled in the fields from daybreak, so by midday they were hungry.

“The whole day was structured differently than it is today,” says Day. “People got up much earlier and went to bed much earlier.”

By midday workers had often worked for up to six hours. They would take a quick break and eat what was known as a “beever” or “noonshine”, usually bread and cheese. As artificial light developed, dinner started to shift later in the day for the wealthier and, as a result, a light meal during the day was needed.

The origins of the word “lunch” are mysterious and complicated, says Day. “Lunch was a very rare word up until the 19th Century,” he says.

One theory is that it’s derived from the word “nuncheon”, an old Anglo-Saxon word which meant a quick snack between meals that you could hold in your hands. It was used around the late 17th Century, says Yeldham. Others theorise that it comes from the word “nuch”, which was used in the 16th and 17th Centuries and meant a big piece of bread.

But it’s the French custom of “souper” in the 17th Century that helped shape what most of us eat for lunch today. It became fashionable among the British aristocracy to copy the French and eat a light meal in the evening. It was a more private meal while they gamed and womanised, says Day.

It’s the Earl of Sandwich’s famous late-night snack from the 1750s that has come to dominate the modern lunchtime menu. One evening he ordered his valet to bring him cold meats between some bread. He could eat the snack with just one hand and wouldn’t get grease on anything.

Whether he was wrapped up in an all-night card game or working at his desk is not clear, both have been suggested. But whatever he was doing, the sandwich was born.

At the time, however, lunch was still known as “an accidental happening between meals”, says food historian Monica Askay.

Again, it was the Industrial Revolution that helped shape lunch as we know it today. Middle and lower class eating patterns were defined by working hours. Many were working long hours in factories and to sustain them a noon-time meal was essential.

Pies were sold on stalls outside factories. People also started to rely on mass-produced food as there was no room in towns and cities to keep a pig or grow their own food. Many didn’t even have a kitchen.

“Britain was the first country in the world to feed people with industrialised food,” says Day.

The ritual of taking lunch became ingrained in the daily routine. In the 19th Century chop houses opened in cities and office workers were given one hour for lunch. But as war broke out in 1939 and rationing took hold, the lunch was forced to evolve. Work-based canteens became the most economical way to feed the masses. It was this model that was adopted by schools after the war.

The 1950s brought a post-war world of cafes and luncheon vouchers. The Chorleywood Process, a new way of producing bread, also meant the basic loaf could be produced more cheaply and quickly than ever. The takeaway sandwich quickly began to fill the niche as a fast, cheap lunch choice.

Today the average time taken to eat lunch – usually in front of the computer – is roughly 15 minutes, according to researchers at the University of Westminster. The original meaning of lunch or “nuncheon” as a small, quick snack between proper meals is just as apt now as it ever was.

 

Dinner was the one meal the Romans did eat, even if it was at a different time of day.

In the UK the heyday of dinner was in the Middle Ages. It was known as “cena”, Latin for dinner. The aristocracy ate formal, outrageously lavish dinners around noon. Despite their reputation for being unruly affairs, they were actually very sophisticated, with strict table manners.

They were an ostentatious display of wealth and power, with cooks working in the kitchen from dawn to get things ready, says Yeldham. With no electricity cooking dinner in the evening was not an option. Peasants ate dinner around midday too, although it was a much more modest affair.

As artificial lighting spread, dinner started to be eaten later and later in the day. It was in the 17th Century that the working lunch started, where men with aspirations would network.

The middle and lower classes’ eating patterns were also defined by their working hours. By the late 18th Century most people were eating three meals a day in towns and cities, says Day.

By the early 19th Century dinner for most people had been pushed into the evenings, after work when they returned home for a full meal. Many people, however, retained the traditional “dinner hour” on a Sunday.

The hallowed family dinner we are so familiar with became accessible to all in the glorious consumer spending spree of the 1950s. New white goods arrived from America and the dream of the wife at home baking became a reality. Then the TV arrived.

TV cook Fanny Cradock brought the 1970s Cordon Bleu dinner to life. Many middle-class women were bored at home and found self-expression by competing with each other over who could hold the best dinner party.

The death knell for the family dinner supposedly sounded in 1986, when the first microwave meal came on to the market. But while a formal family dinner may be eaten by fewer people nowadays, the dinner party certainly isn’t over – fuelled by the phenomenal sales of recipe books by celebrity chefs.

Source: BBC

Why Gladwell’s 10,000-hour rule is wrong


There’s no magic number for becoming a world-beater, says science writer David Bradley. Just ask the psychologist whose research formed the basis of the popular idea.

Being exceptional at something is often attributed to one’s genes. Talent, it seems, is passed down from parents or grandparents, whether it is musical or artistic skill, ability with numbers or being great at juggling. No doubt there are significant genetic factors involved, but there are almost certainly environmental factors in the mix too. Perhaps the two work together, one boosting the other, so that those remarkable genes give rise to remarkable talent only if the skills are suitably nurtured.

However, many people now recognise that talent is learned and earned through extended and intense practice of a skill. No pain, no gain, as they say, in which case genes may have little to do with it.

This idea is encapsulated in a golden rule made popular by the writer Malcolm Gladwell in his book Outliers. This “10,000 hours of practice” rule is based on research by psychologist Anders Ericsson, now at Florida State University. The rule tells us that a mere 10,000 hours of dedicated practice in your particular field is sufficient to bring out the best in you. Is this true? Let’s trace how the rule emerged.

In essence, Ericsson’s theory suggests that sufficient practice in a particular skill can take anyone to a proficiency level equivalent to that of a top classical musician. To illustrate the point, Gladwell focuses on one of Ericsson’s key studies on violinists at Berlin’s Academy of Music. Students had begun playing at around five years of age, all putting in similar practice times, but by age eight, the practice times began to diverge, some practising more than others. By age twenty, the elite performers totalled 10,000 hours of practice each, while the merely good students had totalled 8,000 hours, and the less able performers had just over 4,000 hours of practice.

Ericsson and his colleagues discovered a similar pattern in professional and amateur pianists. By the age of twenty, amateurs had put in 2,000 hours of practice, whereas professionals had done considerably more – reaching 10,000 hours, in fact. “The idea that excellence at performing a complex task requires a critical minimum level of practice surfaces again and again in studies of expertise,” writes Gladwell in Outliers.

Fab formula?

Gladwell points out that all great sportspeople, performers and even computer programmers got in their 10,000 hours of practice in their particular art early in life, allowing them to shine while their less-diligent contemporaries were still grappling with the basics. For instance, he cites the figure of 10,000 hours in connection with the early days of The Beatles when they played almost endless nights in the clubs and bars of Hamburg, Germany, between 1960 and 1964. This opportunity gave them something few musicians had during that era – plenty of time to practise. Ultimately, says Gladwell, this is what made the Fab Four top musicians and songwriters.

He also cites Bill Gates, the co-founder of computer software giant Microsoft, as a great example of the 10,000-hour rule. He had rare access to a computer in 1968 at the age of 13, at a time when most of his school friends in Seattle would have been playing baseball, or dreaming of putting flowers in their hair and heading to San Francisco. Gates spent night times and weekends with friends in the computer room, which gave him a substantial head start in the area of programming, and apparently allowed him to build his company at a much younger age than he might otherwise have been able to.

Many of us imagine that hours and hours spent on our chosen pursuit are somehow edging us towards that target of 10,000. I’ve played guitar since the age of 12, but I don’t imagine that I’m anything but a total amateur, musically speaking, because I’ve not put in the dedicated, repetitive practice. Anyone who has heard me strumming might suggest that I plug headphones into my guitar amp and practise for another 10,000 hours before letting anyone ever hear me play again.

One person who might agree is Ericsson, the psychologist on whose research Gladwell apparently based his interpretation of the 10,000-hour rule. Not because he has heard me play, but because this rule is not quite as it may seem.

To notch up 10,000 hours would require about 90 minutes of practice every day for 20 years. This might explain why the typical child learning the piano will never make it to concert level. Three hours a day gets you to that stage within a decade, so start at the age of ten and you’re done before you’re out of your teens.
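
As a rough sanity check, the arithmetic behind these practice-time figures can be sketched in a few lines of Python. The snippet is purely illustrative; the daily amounts are the ones quoted in this article:

    # Total practice accumulated at a steady daily rate.
    def total_hours(hours_per_day, years):
        return hours_per_day * 365 * years

    print(total_hours(1.5, 20))  # ~10,950 h: 90 minutes a day for 20 years
    print(total_hours(3.0, 10))  # ~10,950 h: three hours a day for a decade
    print(total_hours(3.0, 23))  # ~25,185 h: three hours a day for more than 20 years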

Unfortunately, the moment the 10,000-hour mark is reached is not a skills tipping point – to use another phrase popularised by Gladwell. Learning and gaining experience are gradual processes; skills evolve slowly, with practice. And there is a vast range of time periods over which different individuals reach their own peak of proficiency – their concert level, you might say – in whatever field.

Unattainable goal

Returning to Ericsson’s original study on violinists, he and his colleagues did indeed find that the best of the Berlin Academy of Music’s students spent significantly more time practising than the lesser-accomplished musicians. But there is nothing magical about the 10,000 figure, as Ericsson said recently, because the best group of musicians had accumulated an average, not a total, of over 10,000 hours by the age of twenty. In the world of classical music it seems that the winners of international competitions are those who have put in something like 25,000 hours of dedicated, solitary practice – that’s three hours of practice every day for more than 20 years.

In fact, one can attain international-level status in less time, especially if the area is less competitive. For instance, Ericsson and colleagues have found that college students could reach world-class performance in memorising digits after only 500 to 1,000 hours of training.

Ericsson is also on record as emphasising that not just any old practice counts towards the 10,000-hour average. It has to be deliberate, dedicated time spent focusing on improvement. Not all the examples in Gladwell’s book qualify as such deliberate practice: writing computer programs and playing ice-hockey matches, for instance, may not count. It’s not a matter of simply taking part in an activity, Ericsson argues. Sportspeople have other considerations; for instance, there are physical limits on how much dedicated practice is possible.

But the question of whether or not 10,000, or even 25,000 hours of practice is enough does not tell us anything about whether some people are born with a particular talent. We do not yet know whether anyone with strong enough motivation and the spare time could become a virtuoso simply through deliberate practice, year in year out.

Scientifically speaking, 10,000 hours is not a precise figure but shorthand for “lots and lots of dedicated practice”. Even 10,000 hours of dedicated practice may not be enough to give you the skills of a virtuoso. But whether you dream of playing in the concert hall, wielding the guitar, or competing on the running track, 10,000 hours is a good starting point. Double that and you may even be winning international competitions.

However you look at it, being the best requires a lot of time and effort, and few people are willing to dedicate so much of their lives to a single pursuit. So while practice may get some of us close to perfection, for many of us it is an unattainable goal. That’s no reason not to give it a try, of course. Some day, I might even unplug those headphones once more.

Source: BBC

Great whites ‘not evolved from megashark’


A new fossil discovery has helped quell 150 years of debate over the origin of great white sharks.

Carcharodon hubbelli, which has been described by US scientists, shows intermediate features between the present-day predators and smaller, prehistoric mako sharks.

The find supports the theory that great white sharks did not evolve from huge megatooth sharks.

The research is published this week in the journal Palaeontology.

Palaeontologists have previously disagreed over the ancestry of the modern white sharks, with some claiming that they are descended from the giant megatooth sharks, such as Megalodon (Carcharocles megalodon).

“When the early palaeontologists put together dentitions of Megalodon and the other megatooth species, they used the modern white shark to put them together, so of course it’s going to look like a white shark because that’s what was used as a model,” explained Professor Dana Ehret of Monmouth University in New Jersey, who led the new research.

Modern day white sharks show similarities in the structure of their teeth with the extinct megatooth sharks.

As they both sport serrations on the cutting edges, early scientists working on the animals used this as evidence for the sharks being closely related.

“But we actually see the evolution of serrations occurring many times in different lineages of sharks and if you look at the shape and size of the serrations in the two groups you see that they are actually very different from each other,” Professor Ehret told BBC News.

“White sharks have very large, coarse serrations whereas megalodon had very fine serrations.”

Now, additional evidence from the newly described species shows both white shark-like tooth shape and other features characteristic of broad-toothed mako sharks, which feed on smaller fish rather than primarily on seals and other large mammals.

“It looks like a gradation or a transition from broad-toothed makos to the modern white shark. It’s a transitional species, and you don’t see that a whole lot in the fossil record,” Professor Ehret said.

The mako-like characteristics of the new species, named Carcharodon hubbelli in honour of Gordon Hubbell – the researcher who discovered it in the field – were only found due to the incredible preservation of the fossil.

“A big issue in shark palaeontology is that we tend to only have isolated teeth, and even when you find associated teeth very, very rarely are they articulated in a life position,” continued Professor Ehret.

“The nice thing about this new species is that we have an articulated set of jaws which almost never happens and we could see that the third anterior tooth is curved out, just like in the tooth row of mako sharks today,” he said.

David Ward, an associate researcher at the Natural History Museum, London, who was not involved in the study told BBC News: “Everyone working in the field will be absolutely delighted to see this relationship formalised.”

The mosaic of both white shark-like and mako-like characters had been spotted by the researchers in an initial description of the fossil, but the age of the fossil meant their conclusion that the species was intermediate between a mako ancestor and modern white sharks wasn’t fully accepted.

“Some folks said, ‘Well, it makes a great story, but it’s not old enough, because by this time, the early Pliocene, we see full-blown white sharks in the ocean.’”

This led Ehret and his team to revisit the original site the fossil was taken from, the Pisco Formation in Peru, to re-examine the geology of the area, guided by the original field notes of Gordon Hubbell.

“Gordon gave us two photographs from when he actually collected the specimen and then a hand drawn map with a little ‘X’ on it. We tried to use the map and we didn’t have much luck.

“But using the two pictures of the excavation, my colleague Tom Devries was able to use the mountains in the background.”

“We literally walked through the desert holding the pictures up, trying to compare them. That’s how we found the site.”

Not only did they find the site, but the team were able to discover the precise hole from which the fossil had been excavated in 1988, before making a lucky escape from the desert.

“We made it back to Lima with about three hours to spare before an earthquake hit and shut down the transcontinental highway for two weeks. It was quite a trip.”

By analysing the species of molluscs found fossilised at the site, the team found that the shark was actually two million years older than had been thought, making it roughly 6.5 million years old.

“That two million year push-back is pretty significant because in the evolutionary history of white sharks, that puts the species in a more appropriate time category to be ancestral or… an intermediate form of white shark.”

“We’ve bolstered the case that white sharks are just highly modified makos… It fits the story now,” Professor Ehret told BBC News.

Source: BBC

 

World’s leggiest millipede put under microscope


The anatomical secrets of the world’s leggiest creature, a millipede with 750 legs, have been revealed by scientists.

The species, called Illacme plenipes, was first seen 80 years ago but was recently rediscovered in California.

Now researchers have found that as well as bearing an extreme number of legs, the creature may have more in common with millipedes that lived millions of years ago than today’s species.

The study is published in the journal ZooKeys.

“It’s a kind of mythical creature in the millipede world,” said Dr Paul Marek, an entomologist from the University of Arizona and the lead author of the paper.

Record breaker

In 2005, Dr Marek and his brother discovered some of the leggy arthropods lurking under boulders in the mountains of California. Until then, I. plenipes had not been glimpsed since 1926.

A paper published in the journal Nature outlined the rediscovery and described the creature’s basic biology, but the new research looked at the creature’s anatomy in much more detail.

Despite the name, most millipedes have far fewer than 1,000 feet. Most species belonging to the most common order, Polydesmida, have an average of just 62.

But Dr Marek confirmed that I. plenipes safely holds the record for the leggiest creature: females can have up to 750 legs, while males have up to 562.

“It seems like these legs evolved for their subterranean lifestyle,” he explained.

“They live deep underground: we found them about 10-15cm (4-6in) below the soil’s surface.

“They are typically found clinging onto sandstone boulders. Based on functional morphology of closely related species, it seems like all of these legs evolved to burrow under the ground and to cling onto these large boulders.”

Though they have many limbs, the creatures are small, measuring about 3cm (1in) long.

Open wide

Close examination of the creature revealed that it had some ancient features.


Most millipedes chew leaves and decaying vegetation with grinding mouth parts.

But the scientists found that this species had a more rudimentary anatomy. Its jaws are fused to its head, and Dr Marek believes that it pierces and then sucks up plant and fungal tissues to satisfy its appetite.

The creature’s body segments were also more similar to ancient millipedes than to most other species found today.

Dr Marek said that millions of years ago creatures like I. plenipes would have been widespread, but now it was one of the last of its kind.

He explained: “It is a relict species. Its most closely related lineages are in South Africa and there is nothing related to this species in the entire North America region. Its anatomy retains a number of primitive characteristics.”

The animal, which also has a number of other unusual features such as body hairs that secrete silk, is thought to be extremely rare and only found in a small area close to San Francisco.

Dr Marek said: “Based on our search of the area, it seems like it is known in three spots – and these spots are about 4.5km (3 miles) away from one another.

“It does seem that this creature is restricted both in terms of geography and also evolutionarily.”

Dr Marek, a self-confessed millipede enthusiast, said that rediscovering the record-breaking creature was a staggering experience, but that it would be “awesome” if someone unearthed an even leggier creature.

He said: “The name millipede would no longer be a misnomer – it would only need to add a few more segments to get an even 1,000 (legs), which would be fantastic.”

Source: BBC

Unmanned aircraft project leads push to civilian drones


The “Pandora’s box” of unmanned aircraft in the UK has been opened, according to the Astraea consortium.

Yet many technology and ethics issues surrounding civilian drones remain to be solved, journalists at London’s Science Media Centre were told.

The UK-led, £62m Astraea project – which has the participation of the UK Civil Aviation Authority – is attempting to tackle all facets of the idea.

Later in November, they will carry out a crucial collision-avoidance test.

Unmanned aircraft, or UAs, is something of a new name for drones, which have gained notoriety principally in the theatre of war, where remotely operated aircraft are used for surveillance or air strikes.

But the same technology put to use for civilian purposes is already a hot topic of debate in the UK and abroad, most recently surrounding their use by London’s Metropolitan Police.


A recent report by the UK’s Aerospace, Aviation and Defence Knowledge Transfer Network (KTN) found that applications for unmanned aircraft could be worth some £260bn – replacing costly or dangerous work done by manned planes, or opening up new applications that are currently out of reach.

Crop or wildlife stock monitoring, search and rescue, and check-ups on railway lines are some of the envisioned uses of UAs.

“All these things are currently done by manned aircraft, and they’re done in currently quite hazardous environments,” said Ruth Mallors, director of the Aerospace KTN.

“We want to use unmanned aircraft in these applications, but to be able to do that we have to demonstrate that we’re complying with the Civil Aviation Authority regulations, which are for manned aircraft.

“There’s not going to be any new regulations – we’ll comply with the regulations in place.”

That is what brings about the technological challenge. The project involves developing sensors to act as the “eyes” of a UA, the software to carry out manoeuvres and collision avoidance, and the aircraft themselves.

Points of debate

Plans for UAs envision that a pilot will always be on the ground controlling them, but they must have on-board technology that can perform in an emergency – in the eyes of aviation law – as well as a pilot.

“These things are going to have a level of self-determinism, particularly if you ever lose the communication link with the ground control,” said Lambert Dopping-Hepenstal, Astraea project director. “They’ve got to be able to operate fully safely and take the right decisions.

Drones and the UK

 

  • It is legal to fly your own drone in the UK without any special permission if it weighs less than 20kg and flies more than 150m from a congested area
  • CAA permission is required if it is used for a commercial activity such as aerial photography
  • Permission has been given for inspecting power lines, police use and crop surveillance
  • Direct visual contact with the drone is currently required at all times
  • Drones larger than 20kg would have to be approved for use by the CAA for use in UK airspace in the same way as commercial aircraft
  • The CAA has made clear that it will not approve their use until it is convinced the drone can automatically “sense and avoid” other aircraft

“But we’re not talking about unthinking drones, we’re not talking about irrational and unpredictable behaviour, and we’re not talking about something that gets itself up in the morning, goes off and does its own things and comes home without any human oversight.”

The project has the participation of major contractors including BAE Systems, Rolls-Royce and Thales UK. But they are also working closely with the Civil Aviation Authority, which will ultimately control the licensing for UAs when they pass stringent safety tests.

Gary Clayton, head of research and technology for EADS Cassidian, another project partner, said the CAA’s publication CAP722 is being held up internationally as a template for aviation legislation around UAs.

But Mr Dopping-Hepenstal said the project is aiming much further than the technology and safety legislation.

“What this programme is trying to do is look at this holistically,” he said. “It’s not just the technology, we’re trying to think about the social impact of this and the ethical and legal things associated with it. You’ve got to solve all this lot if you’re going to make it happen, enable it to happen affordably.”

Chris Elliot, an aerospace engineer and barrister, is acting as consultant to the project. He told reporters that the licensing and privacy questions were points “to debate, not to pontificate”.

“We have a very robust privacy regime now for aviation, and I don’t see much very different. A lot of it comes down to what society thinks is acceptable,” he said.

“I find it interesting that Google has got away with it [Streetview] because we love Google and we all use it. If this technology is positioned as something that is good for us, that we like, then people will accept that kind of behaviour.

“Pandora’s box is open – these things are going to fly. What we need is to engage everybody, the public and the specialists, with understanding the good and bad sides.”

For now, though, safety is paramount. The Astraea project will carry out real-world collision-avoidance tests using three planes in two weeks’ time, putting their autonomous control software through its paces and ensuring that unmanned aircraft can independently avoid a crash.

Source: BBC

Call for global crackdown on fake medicines


Counterfeit drugs may contain harmful ingredients or no active ingredient at all.

A global treaty to crack down on the deadly trade of fake medicines is urgently needed, say experts.

Currently, there are more sanctions around the use of illegal tobacco than counterfeit drugs.

Writing in the British Medical Journal, experts urge the World Health Organization to set up a framework akin to its one on tobacco control to safeguard the public.

WHO says more than one in every 10 drug products in poorer nations is fake.

A third of malaria drugs are counterfeit, research suggests.

In richer countries, medicine safety is better, but substandard and falsified drugs still cause thousands of adverse reactions and some deaths.

Recently, in the US, contaminated drug supplies caused an outbreak of meningitis that has so far killed 16 people.

Global problem

Amir Attaran and colleagues from the World Federation of Public Health Associations, International Pharmaceutical Federation and the International Council of Nurses say that while governments and drug companies alike deplore unsafe medicines, it is difficult to achieve agreement on action because discussions too often trespass into conflict-prone areas such as pharmaceutical pricing or intellectual property rights.

Although some countries prohibit fake medicines under national law, there is no global treaty which means organised criminals can continue to trade using haven countries where laws are lax or absent.

WHO estimates nearly a third of countries have little or no medicine regulation.

In other contexts, global treaties have helped governments strengthen their laws and cooperate internationally to clamp down on havens – for example, on money laundering.

Similarly, a new protocol under the Framework Convention on Tobacco Control requires tobacco products to be tracked and criminalises illicit trade globally – “oddly making the law tougher on cigarette falsification than on medicine falsification”, says Amir Attaran.

“The protocol will now make it a requirement to track and trace tobacco products. Cigarette packets can carry serial numbers so it is possible to track them from beginning to end.

“If this is something you can do for a $5 cigarette packet I do not see why we can’t do it for a $3,000 packet of drugs that could save your life.

“In Canada we have seen a fake version of the heart drug Avastin come into the country that contains no active drug, just starch and nail polish remover.

“When you are dealing with a medicine like that, if there was a serial number on it you would be able to easily see if it was fake.”

WHO says it provides direct country and regional support for strengthening medicines regulation.

And it is up to its 194 member states to decide if a treaty is the way forward.

In 2011, a directive to protect patients from fake medicines was approved by the European Parliament.

Source: BBC

Ebola outbreak in Uganda kills two


Up to 90% of those who contract Ebola die from the virus.

A fresh outbreak of the deadly Ebola virus in Uganda has killed at least two people, the health minister has said.

Christine Ondoa said two members of the same family died over the weekend not far from the capital, and a third person in the same area was also suspected to have died of the haemorrhagic fever.

An estimated 17 people died in western Uganda during an outbreak in July.

According to the medical charity Medecins Sans Frontieres (MSF), there had been no cases since August.

‘Avoid gatherings’

Dr Ondoa said that investigators had found conclusive evidence of Ebola in Luweero, about 60km (37 miles) from the capital, Kampala.

A third man had also died in the area late last month after showing symptoms of Ebola; however, no samples were taken from the victim and the case was not reported to health officials at the time, she said.

Five people who came into contact with those who died are being monitored. Two of them have been admitted to an isolation unit at Kampala’s main Mulago hospital, the minister said.

 

There is no known cure for Ebola, but patients can be treated for their symptoms with antibiotics, drugs for pain relief and for other diseases such as malaria, to strengthen their resistance.

The virus causes death in 90% of human cases.

Dr Ondoa said the disease is “very infectious” and kills “in a short time”, but is “easily” preventable.

Among precautionary measures she urged people to take were:

  • Avoid public gatherings, including funerals, in affected districts
  • Bury victims immediately under the supervision of health officials
  • Avoid direct contact with body fluids of Ebola patients by using gloves and masks
  • Disinfect the bedding and clothing of an infected person and
  • Avoid eating dead animals, especially monkeys.

Uganda has seen several major Ebola outbreaks over the past 12 years.

The deadliest was in 2000 when 425 people were infected. More than half of them died.

The BBC’s Catherine Byaruhanga in Kampala says many Ugandans are wondering why the country is so prone to Ebola outbreaks.

The government has said it is because its systems are getting better at detecting them.

Source: BBC

 

BP to pay record fine for 2010 spill


BP Plc is expected to pay a record U.S. criminal penalty and plead guilty to criminal misconduct in the 2010 Deepwater Horizon disaster through a plea deal reached with the Department of Justice (DoJ) that may be announced as soon as Thursday, according to sources familiar with discussions.

Three sources, who spoke to Reuters on condition of anonymity, said BP would plead guilty in exchange for a waiver of future prosecution on the charges.

BP confirmed it was in “advanced discussions” with the DoJ and the Securities & Exchange Commission (SEC).

The talks were about “proposed resolutions of all U.S. federal government criminal and SEC claims against BP in connection with the Deepwater Horizon incident,” it said in a statement on Thursday, but added that no final agreements had been reached.

The discussions do not cover federal civil claims, both BP and the sources said.

London-based oil giant BP has been locked in months-long negotiations with the U.S. government and Gulf Coast states to settle billions of dollars of potential civil and criminal liability claims resulting from the April 20, 2010, explosion aboard the Deepwater Horizon rig.

The sources did not disclose the amount of BP’s payment, but one said it would be the largest criminal penalty in U.S. history. That record is now held by Pfizer Inc, which paid a $1.3 billion fine in 2009 for marketing fraud related to its Bextra pain medicine.

The DoJ declined to comment.

The deal could resolve a significant share of the liability that BP faces after the explosion killed 11 workers and fouled the shorelines of four Gulf Coast states in the worst offshore spill in U.S. history. BP, which saw its market value plummet and replaced its CEO in the aftermath of the spill, still faces economic and environmental damage claims sought by U.S. Gulf Coast states and other private plaintiffs.

The fine would far outstrip BP’s last major settlement with the DoJ in 2007, when it paid about $373 million to resolve three separate probes into a deadly 2005 Texas refinery explosion, an Alaska oil pipeline leak and fraud for conspiring to corner the U.S. propane market.

The massive settlement, which comes a week after the U.S. presidential election, could ignite a debate in Congress about how funds would be shared with Gulf Coast states, depending on how the deal is structured. Congress passed a law last year that would earmark 80 percent of BP penalties paid under the Clean Water Act to the spill-hit states of Louisiana, Mississippi, Alabama, Florida and Texas.

POTENTIAL LIABILITY

In an August filing, the DoJ said “reckless management” of the Macondo well “constituted gross negligence and willful misconduct” which it intended to prove at a civil trial set to begin in New Orleans in February 2013. The U.S. government has not yet filed any criminal charges in the case.

Given that the deal will not resolve any civil claims brought by the Justice Department, it is also unclear how large a financial penalty BP might pay to settle those claims, or what other punishments it might face.

Negligence is central to BP’s potential liability. A gross negligence finding could nearly quadruple the civil damages owed by BP under the Clean Water Act to $21 billion in a straight-line calculation.
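
That straight-line calculation presumably multiplies the spill volume by the Clean Water Act’s per-barrel maximums: roughly $1,100 per barrel for an ordinary violation, rising to about $4,300 per barrel where gross negligence is found (the figures widely cited at the time, assumed here rather than confirmed by this article). A minimal sketch of the arithmetic, under those assumptions:

    # Illustrative Clean Water Act penalty arithmetic; the per-barrel
    # maximums are the figures widely cited in 2012, not an official calculation.
    SPILL_BARRELS = 4_900_000           # estimated barrels spilled
    RATE_STANDARD = 1_100               # dollars per barrel, ordinary violation
    RATE_GROSS_NEGLIGENCE = 4_300       # dollars per barrel, gross negligence

    standard = SPILL_BARRELS * RATE_STANDARD        # ~$5.4 billion
    gross = SPILL_BARRELS * RATE_GROSS_NEGLIGENCE   # ~$21.1 billion
    print(standard, gross, gross / standard)        # ratio ~3.9: 'nearly quadruple'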

Still unresolved is the potential liability faced by Swiss-based Transocean Ltd, owner of the Deepwater Horizon vessel, and Halliburton Co, which provided cementing work on the well that U.S. investigators say was flawed. Neither company was immediately available for comment.

According to the Justice Department, errors made by BP and Transocean in deciphering a pressure test of the Macondo well are a clear indication of gross negligence.

“That such a simple, yet fundamental and safety-critical test could have been so stunningly, blindingly botched in so many ways, by so many people, demonstrates gross negligence,” the government said in its August filing.

Transocean in September disclosed it is in discussions with the Justice Department to pay $1.5 billion to resolve civil and criminal claims.

The mile-deep Macondo well spewed 4.9 million barrels of oil into the Gulf of Mexico over a period of 87 days. The torrent fouled shorelines from Texas to Florida and eclipsed in severity the 1989 Exxon Valdez spill in Alaska.

BP has already announced an uncapped class-action settlement with private plaintiffs that the company estimates will cost $7.8 billion to resolve litigation brought by over 100,000 individuals and businesses.

Moderate drinking in pregnancy ‘harms IQ’


Current government advice is to avoid drinking alcohol during pregnancy

Drinking one or two glasses of wine a week during pregnancy can have an impact on a child’s IQ, a study says.

Researchers from Oxford and Bristol universities looked at the IQ scores of 4,000 children as well as recording the alcohol intake of their mothers.

They found “moderate” alcohol intake of one to six units a week during pregnancy affected IQ.

Experts said the effect was small, but reinforced the need to avoid alcohol in pregnancy.

Previous studies have produced inconsistent and confusing evidence on whether low to moderate levels of alcohol are harmful in pregnancy, largely because it is difficult to separate out other factors that may have an effect such as the mother’s age and education.

But this research, published in the PLOS One journal, ruled that out by looking at changes in the genes that are not connected to social or lifestyle effects.

‘Why take the risk?’

The study found that four genetic variants in alcohol-metabolising genes in children and their mothers were strongly related to lower IQ at age eight.

On average, the child’s IQ was almost two points lower per genetic variant they possessed.

But this effect was only seen among the children of women who drank between one and six drinks a week during pregnancy and not among women who abstained when they were pregnant.

The researchers said although a causal effect could not be proven, the way they had done the study strongly suggested that it was exposure to alcohol in the womb that was responsible for the differences in child IQ.

Dr Ron Gray, from Oxford University, who led the research, added that although the differences appeared small, they may well be significant, and that lower IQ had been shown to be associated with being socially disadvantaged, having poorer health and even dying younger.

“It is for individual women to decide whether or not to drink during pregnancy; we just want to provide the evidence.

“But I would recommend avoiding alcohol. Why take the risk?”

A Department of Health spokesman said that since 2007 their advice had been that women who are trying to conceive or are pregnant should avoid alcohol.

But Dr Clare Tower, consultant in obstetrics and fetal maternal medicine, at St Mary’s Hospital, Manchester, stressed that women who have had the occasional alcoholic drink in pregnancy should not be overly alarmed by the findings.

“Current UK advice is that the safest course of action is abstinence during pregnancy.

“The finding of this study would concur that this is undoubtedly the safest advice.”

But she pointed out that another recent study had found no effect on IQ at five years.

“It is likely, therefore, that any impact is small and not seen in all women.”

Source: BBC

A 14-year-old boy named Gerrit Blank survived a direct hit by a meteorite as it fell to Earth at more than 30,000mph. Sounds scary, doesn’t it?


According to a report in The Telegraph, Gerrit Blank was on his way to school when he saw a “ball of light” heading straight towards him from the sky. A red-hot, pea-sized piece of rock then hit his hand before bouncing off and leaving a foot-wide crater in the ground.

The teenager survived the strike, the odds of which are put at just one in a million, but was left with a nasty three-inch scar on his hand.

“At first I just saw a large ball of light, and then I suddenly felt a pain in my hand. Then a split second after that there was an enormous bang like a crash of thunder. The noise that came after the flash of light was so loud that my ears were ringing for hours afterwards. When it hit me it knocked me flying and then was still going fast enough to bury itself into the road,” he explained. “I am really keen on science and my teachers discovered that the fragment is really magnetic,” said Gerrit.

Scientists examined the pea-sized meteorite, which crashed to Earth in Essen, Germany, and confirmed that it had fallen from space. Ansgar Kortem, director of Germany’s Walter Hohmann Observatory, said: “It’s a real meteorite, therefore it is very valuable to collectors and scientists.”

Source: http://www.mostbeautifulpages.com