The Economic Burden of Dengue in Puerto Rico


Households incurred 48% of the cost of dengue illness, the government 24%, insurance 22%, and employers 7%.

Dengue incidence is increasing globally — but at what cost, and to whom?

In a recent, vaccine manufacturer–supported study, researchers used surveillance data, patient interviews, medical records, and financial data from patients, health facilities, and insurance companies to estimate the average annual aggregate cost of treated dengue cases in Puerto Rico from 2002 through 2010. Cost per case — used to project aggregate cost — was determined through detailed assessment of 100 patients with laboratory-confirmed dengue, of whom 67 were hospitalized and 44 were aged <15 years.

During the study period, >60,000 suspected dengue cases were reported; >22,000 of these cases were confirmed. The average annual aggregate cost of treated cases was $38.7 million ($10.40 per capita); adults accounted for 70% of these costs, and hospitalized patients for 63%. Households incurred 48% of this cost, the government 24%, insurance 22%, and employers 7%. When the costs of surveillance and control programs were added to the cost of dengue illness, the aggregate cost rose to $46.4 million ($12.47 per capita). Work absenteeism per episode exceeded that reported for influenza in the U.S. and other industrialized countries.
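As a back-of-the-envelope check (our arithmetic, not figures reported by the study), the estimates are internally consistent: both per-capita figures imply the same population base, and their difference is the program cost that the Comment below compares against the cost of illness:

\[
\frac{\$38.7\text{M}}{\$10.40/\text{person}} \approx \frac{\$46.4\text{M}}{\$12.47/\text{person}} \approx 3.7 \text{ million residents}, \qquad \$46.4\text{M} - \$38.7\text{M} = \$7.7\text{M} \approx \tfrac{1}{5} \times \$38.7\text{M}
\]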

Comment: As noted by the authors and an editorialist, the economic burden of dengue has been poorly defined and generally underestimated. In Puerto Rico, households funded almost half the cost of illness. The cost of dengue illness there was roughly five times the cost of surveillance and vector-control programs, leading the authors to suggest that increases in the latter would pay off economically. This analysis is useful for planners considering investments in interventions, including vaccines, but because of regional differences in dengue epidemiology, costs will vary by geographic region.

Source: Journal Watch Infectious Diseases


Does Appropriateness of PCI Influence Procedural Outcomes?


An analysis of registry data reveals no such association, suggesting that patient selection and procedural performance should be assessed separately.

Appropriate use criteria (AUC) for percutaneous coronary intervention (PCI) have become a major quality metric for patient selection and resource utilization. In this analysis, investigators made use of the National Cardiovascular Data Registry to determine whether an association exists between adherence to AUC and in-hospital outcomes of PCI.

Included were 203,561 patients undergoing PCI for nonacute indications at 779 hospitals during 2009–2011. A total of 12.1% of the procedures were classified as inappropriate (range across hospitals, 0%–56.6%). Compared with the hospital tertile with the lowest median rate of inappropriate PCI (5.3%), the tertile with the highest rate (20.0%) had similar risk-adjusted in-hospital mortality (0.3%; odds ratio, 1.12; P=0.35) and periprocedural bleeding (1.7%; OR, 1.02; P=0.07), as well as a similar rate of provision of guideline-recommended medications at discharge (85.2%; P=0.58).
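For readers less familiar with the metric (a standard definition, not specific to this registry analysis), the odds ratio compares the odds of an event between the two tertiles:

\[
\mathrm{OR} = \frac{p_{\text{high}}/(1-p_{\text{high}})}{p_{\text{low}}/(1-p_{\text{low}})}
\]

where \(p_{\text{high}}\) and \(p_{\text{low}}\) are the event probabilities in the highest- and lowest-tertile hospitals. An OR close to 1 with a nonsignificant P value, as observed here for mortality and bleeding, indicates no detectable difference after risk adjustment.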

Comment: These investigators failed to find an association between inappropriate percutaneous coronary intervention and in-hospital outcomes, suggesting that patient selection for PCI by appropriate use criteria represents an aspect of PCI quality distinct from the processes of care assessed by other metrics, including postprocedure outcomes. The wide variation in PCI appropriateness among hospitals is cause for concern, but these findings indicate that AUC adherence alone is an insufficient measure of the quality of PCI at any given hospital.

Source: Journal Watch Cardiology

Sensing the infrared: Researchers improve infrared detectors using single-walled carbon nanotubes


Whether used in telescopes or optoelectronic communications, infrared detectors must be continuously cooled to avoid being overwhelmed by stray thermal radiation. Now, a team of researchers from Peking University, the Chinese Academy of Sciences, and Duke University (USA) is harnessing the remarkable properties of single-walled carbon nanotubes (SWNTs) to create highly sensitive, “uncooled” photovoltaic infrared detectors.

This new type of detector, which the team describes in a paper published today in the Optical Society’s (OSA) open-access journal Optical Materials Express, may prove useful for industrial, military, manufacturing, optical communications, and scientific applications.

Carbon nanotubes are known for their outstanding mechanical, electrical, and optical properties. “They also are an ideal nanomaterial for infrared applications,” says Sheng Wang, an associate professor in the Department of Electronics at Peking University in Beijing, China, and an author of the Optical Materials Express paper. “For starters, these nanotubes exhibit strong and broadband infrared light absorption, which can be tuned by selecting nanotubes of different diameters. Also, due to their high electron mobility, nanotubes react very rapidly – on the order of picoseconds – to infrared light.” Compared with traditional infrared detectors, which are based on semiconductors made of a mercury-cadmium-telluride alloy, the SWNT detectors are an order of magnitude more efficient, the researchers report.

The team’s photovoltaic infrared detector is formed by aligning SWNT arrays on a silicon substrate. The nanotube arrays are then placed between asymmetric palladium and scandium contacts. These two metals have properties that collectively create what is known as an Ohmic contact, a region in a semiconductor device with very low electrical resistance, which helps the detector operate more efficiently.

“Fabrication of carbon nanotube infrared detectors can be readily implemented on a flexible substrate and large wafer at a low cost,” explains Wang.

The detector demonstrated “acceptable sensitivity” at room temperature and may be significantly improved by increasing the density of the carbon nanotubes, according to the team. The signal-to-noise performance of conventional infrared photodetectors is limited by their natural infrared emission, which is subsequently absorbed by the detector. To avoid having this stray radiation overwhelm the detector, liquid nitrogen or electric cooling is generally used to suppress this thermal effect. However, this makes infrared detectors more complex and expensive to operate. The new design eliminates this need because carbon nanotubes have special thermal properties. At room temperature, they emit comparatively little infrared radiation of their own, especially when the carbon nanotube is on the substrate. In addition, nanotubes are very good at conducting heat, so temperatures do not build up on the detector itself.

One of the biggest surprises for the team was achieving relatively high infrared detectivity (a figure of merit for sensitivity: the higher the detectivity, the less radiation power is needed to produce a detectable signal) using a carbon nanotube thin film only a few nanometers thick, Wang points out. Notably, conventional infrared detectors require much thicker films, on the scale of hundreds of nanometers, to obtain comparable detectivity.
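For context (the standard figure of merit in the detector literature, not a formula taken from this paper), specific detectivity D* normalizes the noise-equivalent power (NEP) by detector area and bandwidth:

\[
D^{*} = \frac{\sqrt{A\,\Delta f}}{\mathrm{NEP}}
\]

where \(A\) is the active area and \(\Delta f\) the measurement bandwidth; the higher the D*, the weaker the infrared signal a detector of a given size can register.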

Another huge advantage of the detector is that the fabrication process is completely compatible with carbon nanotube transistors – meaning no big expensive equipment changes are necessary. “Our doping-free chemical approach provides an ideal platform for carbon nanotube electronic and optoelectronic integrated circuits,” says Wang.

The next step for the team is to improve the detector’s detectivity by increasing SWNT density, and to achieve a wider spectral response through improved diameter control.

More information: The paper, “Carbon Nanotube Arrays Based High-Performance Infrared Photodetector,” by Q. Zeng et al. will appear in a special feature issue on “Nanocarbon for Photonics and Optoelectronics” in Vol. 2, Issue 6 of Optical Materials Express.

Journal reference: Optical Materials Express

Source: Optical Society of America


Carl Sagan on Mastering the Vital Balance of Skepticism & Openness


Fine-tuning the machinery of distinguishing the valid from the non-valid.

Seven years ago this week, David Foster Wallace argued that “learning how to think really means learning how to exercise some control over how and what you think.” Yet in an age of ceaseless sensationalism, pseudoscience, and a relentless race for shortcuts, quick answers, and silver bullets, knowing what to think seems increasingly challenging. We come up with tools like The Baloney Detection Kit and create wonderful animations to teach kids about critical thinking, but the art of thinking critically is a habit that requires careful and consistent cultivation. In his remarkable essay titled “The Burden of Skepticism,” originally published in the Fall 1987 issue of Skeptical Inquirer, Carl Sagan — always the articulate and passionate explainer — captured the duality and osmotic balance of critical thinking beautifully:

It seems to me what is called for is an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas. Obviously those two modes of thought are in some tension. But if you are able to exercise only one of these modes, whichever one it is, you’re in deep trouble.

If you are only skeptical, then no new ideas make it through to you. You never learn anything new. You become a crotchety old person convinced that nonsense is ruling the world. (There is, of course, much data to support you.) But every now and then, maybe once in a hundred cases, a new idea turns out to be on the mark, valid and wonderful. If you are too much in the habit of being skeptical about everything, you are going to miss or resent it, and either way you will be standing in the way of understanding and progress.

On the other hand, if you are open to the point of gullibility and have not an ounce of skeptical sense in you, then you cannot distinguish the useful ideas from the worthless ones. If all ideas have equal validity then you are lost, because then, it seems to me, no ideas have any validity at all.

Some ideas are better than others. The machinery for distinguishing them is an essential tool in dealing with the world and especially in dealing with the future. And it is precisely the mix of these two modes of thought that is central to the success of science.

Source: http://www.brainpickings.org/

Chi-Huey Wong Awarded Nikkei Asia Prize


Chi-Huey Wong, Scripps Research Institute Ernest W. Hahn Professor of Chemistry, has been awarded the Nikkei Asia Prize for science, technology, and innovation.

Wong, who also serves as president of Academia Sinica in Taiwan, was cited for his research in glycochemistry, “which has opened the way for the development of vaccines and medicines, especially in areas related to cancer, infectious, and immunological diseases,” according to the award announcement.

Launched by Nikkei Inc., a Japan-based global media corporation, the Nikkei Asia Prize recognizes individuals or organizations in Asia whose achievements have improved people’s lives in the region. The program honors significant contributions in three areas: regional growth; science, technology, and innovation; and culture.