Gene Therapy Administered Via Virus Cures Deafness in 11-Year-Old Boy


“There’s no sound I don’t like.”


An 11-year-old boy with congenital deafness can now hear sound after he received groundbreaking gene therapy that replaces a mutated gene with the correct version, The New York Times reports.

“There’s no sound I don’t like,” Aissam Dam told the NYT through interpreters. “They’re all good.”

Dam’s deafness was due to a mutation in a gene called otoferlin, according to the news outlet. The gene encodes a protein that is a key component in relaying sound from the inner ear to the brain; a mutated version of the otoferlin gene impedes this process, affecting around 200,000 people across the globe.

In early October last year, Dam was the first person in America to receive the otoferlin gene therapy as part of a clinical study by pharmaceutical company Eli Lilly and Akouos, a gene therapy business Eli Lilly bought in 2022.

At the Children’s Hospital of Philadelphia, researchers injected into one of Dam’s ears a liquid containing a benign virus carrying functional copies of the otoferlin gene. Specifically, they delivered the normal genes into his cochlea, a spiral-shaped hollow in the inner ear that is filled with fluid and lined with hair cells that send sound information to the brain.

It took mere days for the gene therapy to do its miraculous work, the NYT reports. Dam’s father said his son began picking up traffic sounds with his treated ear, whose hearing is now “close to normal.”

Two similar studies were recently carried out in China, one of which has already shown remarkable results. Two further studies in Europe, either in progress or planned, will also target the otoferlin mutation.

With Dam’s newfound hearing, the NYT reports, researchers will continue the study and enroll younger patients.

Even though this particular gene therapy targets a rare mutation in the inner ear, it opens up the possibility of treating other forms of congenital deafness.

NICE recommends genetic test to prevent deafness from antibiotics in newborn babies


A genetic test to establish whether a newborn baby is vulnerable to deafness if treated with gentamicin has been recommended in draft guidance by the National Institute for Health and Care Excellence (NICE).1

The Genedrive kit detects the m.1555A>G variant from a DNA swab taken from inside a newborn’s cheek, with results available in under an hour. If the variant is found, the baby can be treated with alternative antibiotics, which cannot be used more widely because of concerns about antibiotic resistance.

At present, laboratory testing cannot return results within the hour recommended by guidelines, so babies who go deaf after being given gentamicin are found to carry the variant only afterwards, through DNA testing. The estimated cost of treating hearing loss with a bilateral cochlear implant is around £65 000 (€73 200; $78 500) in the first year.

Evidence presented to the NICE committee from the Paloh study2, carried out in Manchester and Liverpool, showed no statistically significant difference in time to antibiotic treatment between standard care and care using the Genedrive device, suggesting that the test would not delay the administration of antibiotics.

Gathering evidence

The Genedrive kit has been assessed through NICE’s Early Value Assessment pilot project, which has been created to enable earlier access to digital products, medical devices, and diagnostics to tackle national unmet needs in health and social care.

Once the kit is in use the NHS will collect real world evidence to ensure that the test can be applied in a variety of maternity settings and that it does not lead to increased use of antibiotics associated with higher risk of antimicrobial resistance or a longer time to antibiotic treatment. The results will then be scrutinised by the independent NICE committee as part of the kit’s full assessment.

Mark Chapman, interim director of medical technology at NICE, said, “The costs associated with hearing loss to the NHS are high, so driving an innovation like Genedrive into the hands of health and care professionals to enable best practice can also ensure that we balance the best care with value for money, delivering both for individuals and society as a whole.”

Susan Daniels, chief executive of the National Deaf Children’s Society and lay specialist committee member, said, “Speaking both as a deaf person and as chief executive of the National Deaf Children’s Society, it’s very encouraging that more evidence will be gathered on this important development. I hope this additional evidence will support the argument for the rollout of technology, which could play a pivotal role in preventing deafness in a small number of babies in the future.”

Ring and bracelet system designed to help the hearing-impaired


Take rings, add a bracelet, and you have a novel design to help the hearing-impaired. For people with hearing impairments who do not know sign language, the ring and bracelet system can help both in communicating what they need to say and in receiving messages they can read. The Sign Language Ring acts as a translating device that picks up motion and gestures and translates them into words, delivered by voice through the bracelet. The bracelet, in turn, can translate spoken words onto its readable display panel for the wearer. After use, the rings can be set into the bracelet for storage.


The design was inspired by Buddhist prayer beads. The name of the entire system is the Sign Language Ring, which is actually a set of rings and a bracelet. In all, six gesture-detecting finger rings can be snapped onto the bracelet for storage. The user can program certain gestures to correspond to specific words if desired. The speaker box and readable display wrap around the bracelet.

The Sign Language Ring is a 2013 winner of the red dot award for design concept, an annual competition for design concepts and prototypes. Winning concepts are exhibited at the red dot museum in Singapore for at least one year.

This attempt comes at a time when wearable technology market watchers are recognizing a subset with ample opportunities for growth: wearables as assistive technologies for the deaf, blind, paralyzed, and elderly. In turn, there is interest in “hear ware,” which includes embedding jewelry with technologies that can help those who have hearing difficulties.


In a GigaOm Pro article titled “The wearable computing market: a global analysis,” Jody Ranck noted a 2006 exhibition on hear ware hosted by the Victoria and Albert Museum in London. The technologies shown were developed in response to a call from the UK Design Council to rethink the hearing aid. The result, Ranck wrote, was a fascinating array of wearable devices outfitted with sensors and hearing technology.

Deaf Student, Denied Interpreter by Medical School, Draws Focus of Advocates


Speaking with the parents of a sick infant, Michael Argenyi, a medical student, could not understand why the child was hospitalized. During another clinical training session, he missed most of what a patient with a broken jaw was trying to convey about his condition.

His incomprehension, Mr. Argenyi explained, was not because of a deficiency in academic understanding. Rather, he simply could not hear.


Mr. Argenyi, 26, is legally deaf. Despite his repeated requests to use an interpreter during clinical training, administrators at the Creighton University School of Medicine in Omaha, Neb., have refused to allow it. They have contended that Mr. Argenyi, who is able to speak, communicated well enough without one and that patients could be more hesitant to share information with someone else present. They added that doctors needed to focus on the patient, not a third party, and rely on visual cues to make a proper diagnosis.

Mr. Argenyi took a leave of absence at the end of his second year, in 2011, after suing Creighton for the right to finish his medical training with an interpreter. The case, scheduled to go to trial on Tuesday in Federal District Court in Omaha, is attracting the attention of the federal government and advocates who are concerned that it could deal a setback to continuing efforts to achieve equality for people with disabilities.

“I couldn’t understand so much of the communication in the clinic,” Mr. Argenyi wrote in an e-mail. “It was humiliating to present only half of a history because I had missed so much of what was communicated. I was embarrassed every time I would miss medicine names that I knew from classes but couldn’t understand when the patient or a colleague spoke them.”

Despite tremendous strides over the past four decades with the passage of the Rehabilitation Act and the Americans with Disabilities Act, those with disabilities remain underrepresented in higher education and in the work force. In the medical field, people who are deaf or hard of hearing remain less likely to hold high-skilled positions than those without impairments.

Universities tend to provide requested accommodations after admitting a student they know has a disability, advocates for the deaf say. And most arrangements for the deaf are settled long before any issues reach a courtroom, said Curtis Decker, the executive director of the National Disability Rights Network, a federally financed association of legal services programs.

But, he said of Mr. Argenyi’s lawsuit, “It’s a very important case because, I think, if it’s successful it will send a very powerful message to the university community that the law does cover them and the law is clear about the accommodations that they need to provide.”

Creighton officials maintain that they have provided Mr. Argenyi with the necessary tools for him to succeed in medical school.

“Michael Argenyi is a very bright, capable young man who Creighton believes will make a good doctor,” said Scott Parrish Moore, the lead counsel for Creighton.

After being accepted to Creighton four years ago, Mr. Argenyi asked the university to provide a real-time captioning system for lectures and a cued speech interpreter. (Mr. Argenyi, who does not know sign language, can read lips. An interpreter helps by mouthing words while using hand signals to clarify sounds.) These were the same accommodations that Mr. Argenyi, who had a diagnosis of profound deafness when he was 8 months old, received for much of his schooling, from grade school through undergraduate studies at Seattle University.

Creighton provided Mr. Argenyi with just one of the aids his audiologist had recommended: an FM system, which amplifies the sounds he hears through his cochlear implants. The university also provided note takers for lectures, priority seating and audio podcasts.

Soon after classes began, Mr. Argenyi told school officials that the accommodations were inadequate and that he was missing information. He sued in federal court in Omaha in September 2009, arguing that the university was legally required to provide and pay for the necessary aids.

Mr. Argenyi said he hired his own interpreter and transcription service, which cost him more than $100,000 during his two years in medical school. The breaking point, he said, came during his clinical work in his second year when Creighton refused to allow him to use an interpreter, even if he paid for it himself. The university did allow Mr. Argenyi to use interpreters during a couple of clinics while the Justice Department was trying to broker a settlement, but stopped when a deal could not be reached.

Mr. Argenyi is pursuing degrees in public health and social work at Boston University, which is providing his requested transcription services, while the lawsuit is pending.

 

Source: http://www.nytimes.com

Maternal Prenatal Smoking and Hearing Loss Among Adolescents


ABSTRACT

Importance  Although smoking and secondhand smoke exposure are associated with sensorineural hearing loss (SNHL) in children and adults, the possible association between prenatal smoke exposure and hearing loss has not been investigated despite the fact that more than 12% of US children experience such prenatal exposure each year.

Objective  To investigate whether exposure to prenatal tobacco smoke is independently associated with SNHL in adolescents.

Design  Cross-sectional data were examined for 964 adolescents aged 12 to 15 years from the National Health and Nutrition Examination Survey 2005-2006.

Participants  Participants underwent standardized audiometric testing, and serum cotinine levels and self-reports were used to identify adolescents exposed to secondhand smoke or active smokers.

Main Outcomes and Measures  Prenatal exposure was defined as an affirmative parental response to the question, “Did [Sample Person’s Name] biological mother smoke at any time while she was pregnant with [him/her]?” Sensorineural hearing loss was defined as an average pure-tone hearing level of more than 15 dB for 0.5, 1, and 2 kHz (low frequency) and 3, 4, 6, and 8 kHz (high frequency).
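The SNHL definition above is a simple threshold rule over two frequency bands, which can be sketched as follows. This is a minimal illustration of the definition only, not the study's actual analysis code, and the audiogram values are hypothetical:

```python
# Classify sensorineural hearing loss (SNHL) per the abstract's definition:
# an average pure-tone hearing level above 15 dB over 0.5, 1, 2 kHz
# (low frequency) or over 3, 4, 6, 8 kHz (high frequency).

LOW_FREQS = [0.5, 1, 2]    # kHz
HIGH_FREQS = [3, 4, 6, 8]  # kHz
SNHL_CUTOFF_DB = 15        # average threshold above this counts as SNHL

def pure_tone_average(thresholds_db, freqs):
    """Mean hearing threshold (dB) over the given frequencies."""
    return sum(thresholds_db[f] for f in freqs) / len(freqs)

def classify_ear(thresholds_db):
    """Return which frequency bands meet the SNHL definition for one ear."""
    return {
        "low_frequency_snhl":
            pure_tone_average(thresholds_db, LOW_FREQS) > SNHL_CUTOFF_DB,
        "high_frequency_snhl":
            pure_tone_average(thresholds_db, HIGH_FREQS) > SNHL_CUTOFF_DB,
    }

# Hypothetical audiogram for one ear: threshold in dB at each frequency (kHz)
ear = {0.5: 20, 1: 20, 2: 15, 3: 10, 4: 10, 6: 15, 8: 10}
print(classify_ear(ear))
# low-frequency average = (20 + 20 + 15) / 3 ≈ 18.3 dB, above the cutoff,
# so this ear would count as having low-frequency SNHL but not high-frequency
```

An ear exceeding the cutoff in one band but not the other, as in this example, is what the abstract calls unilateral low-frequency SNHL when it occurs in only one ear.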

Results  Parental responses affirmed prenatal smoke exposure in 16.2% of 964 adolescents. Prenatal smoke exposure was associated with elevated pure-tone hearing thresholds at 2 and 6 kHz (P < .05), a higher rate of unilateral low-frequency SNHL (17.6% vs 7.1%; P < .05) in bivariate analyses, and a 2.6-fold increased odds of having unilateral low-frequency SNHL in multivariate analyses (95% CI, 1.1-6.4) after controlling for multiple hearing-related covariates.
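As a rough sanity check (not taken from the paper), the unadjusted odds ratio implied by the bivariate rates quoted above can be computed directly. The 2.6-fold figure in the text is the multivariate estimate after adjusting for covariates, so the crude value is expected to differ slightly:

```python
# Crude odds ratio implied by the reported bivariate rates of unilateral
# low-frequency SNHL: 17.6% among prenatally exposed adolescents vs 7.1%
# among the unexposed.

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

p_exposed, p_unexposed = 0.176, 0.071
crude_or = odds(p_exposed) / odds(p_unexposed)
print(round(crude_or, 1))  # prints 2.8, close to the adjusted 2.6
```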

Conclusions and Relevance  Prenatal smoke exposure is independently associated with higher pure-tone hearing thresholds and an almost 3-fold increase in the odds of unilateral low-frequency hearing loss among adolescents. These novel findings suggest that in utero exposure to tobacco smoke may be injurious to the auditory system.

Source: JAMA

7 Important Lessons Deaf People Can Teach You About Communication



I have always thought it would be a blessing if each person could be blind and deaf for a few days during his early adult life. Darkness would make him appreciate sight; silence would teach him the joys of sound. ~Helen Keller (blind and deaf American author and educator)

I grew up with wonderful parents who always encouraged my passion for music. I still vividly remember the day they got me a shiny new sound system. Years later they got me a guitar and paid for my guitar classes. And they never had a chance to hear what I was listening to, playing or singing. My parents are deaf.

The reality of deaf people is different from other people’s experiences. Their ability to communicate is limited, but precisely because of that they seem to know so much more about what effective communication means.

I lived in a dormitory for deaf families for over 10 years and had a chance to compare two worlds: home, where I saw people communicating with their hands, and outside, where I observed the interactions of ‘normal’ people with hearing. I was very blessed to have experienced the best and the worst of both worlds: the world of silence and the world of sounds.

These are some of the things I’ve learned from deaf people about effective communication:

1. Maintain eye contact

How many times have you found yourself checking Facebook updates on your iPhone while having a conversation? In the world of deaf people, if you stop looking at the person you are talking to, you literally cut off the conversation, because the only way to ‘hear’ what the other person is trying to say is to look at their face. This is a great lesson on the importance of being present: focusing on the person next to you, staying connected to them and truly receiving what they say.

2. Don’t interrupt, follow the protocol

How many times have you found yourself waiting for someone to finish talking just so you can say what you think? When a group of deaf people is having a conversation, it’s not possible for more than one person to talk at a time. There is only one way to follow the conversation: look at the one speaking. This teaches us to respect the right of each individual to speak up and not to be interrupted in the midst of their self-expression.

3. Be straightforward, to the point and as concise as possible

How often do you communicate your thoughts and needs clearly, without trying to make things sound better than they are? In sign language there are two ways to say a particular word: you either use the alphabet, showing a sign for each letter, or you use a single sign that stands for the entire word.

The second option is much faster and hence more convenient, so almost every word has a specific sign. Can you imagine such a massive amount of information to memorize? Not only do you have to learn how to write and pronounce a word, but also the specific sign that represents it. The nature of sign language requires you to be as specific as possible and to use as few words as needed to convey your message. That’s an essential lesson, as so often we are reluctant to be direct and clear about what we think, want and feel.

4. If you don’t understand something, ask

How often are you reluctant to ask a question when something is unclear to you, or to clarify what a loved one meant rather than making an assumption? We hold back out of fear of being misunderstood, rejected or even humiliated. Each deaf person has their own style of using sign language, so it’s normal to ask the meaning of an unfamiliar sign. There is nothing wrong with not knowing or not understanding something. If that happens, just ask.

5. Cut yourself off from distractions

The world around us is extremely noisy. We have tons of devices, social media and traditional media which, in their attempts to inform, entertain, update and educate, produce an overwhelming informational noise around us. We hear it, see it and feel it. We are so used to being surrounded by that noise that we lose our ability to be focused and present: when we are having a conversation, when we are working, when we are cooking, when we are creating something. We are constantly attacked and distracted by that informational noise. I remember watching my father make furniture. He would always be so focused and immersed in the moment of creating that it seemed nothing in the world could disturb him. Learn to be present. As simple as that.

6. Be expressive and articulate

There are so many ways we can play with our voice when we talk: pace, tone, volume. All this gives us plenty of ways to express our emotions, feelings and attitude toward a particular subject. But how often do we allow ourselves to be expressive? So-called social norms sometimes restrict us from laughing too loud, raising our voice when we are excited or crying in front of others, because these are seen as inappropriate. Deaf people are very expressive by nature. Their facial expressions and gestures can mesmerize you with their intensity and artistry. They don’t really care how others may see them; they just express what they feel without hiding or softening their emotions.

7. Observe, learn and get extra information from what you see and feel

Just imagine how many tiny yet important details we miss in our daily interactions with others. When you cannot hear, you become more attentive to what is happening around you. You learn to notice even the smallest things, to experience the world through all those seemingly insignificant details which, in the bigger picture, play a crucial role. And more importantly, you learn to appreciate them.

Source: Purpose Fairy