VietBF

VietBF (https://www.vietbf.com/forum/index.php)
-   School | Kiến thức 2006-2019 (https://www.vietbf.com/forum/forumdisplay.php?f=273)
-   -   Your's Health (https://www.vietbf.com/forum/showthread.php?t=1234581)

florida80 05-18-2019 18:05

News Release 17-May-2019

Integrated stepped alcohol treatment for people in HIV care improves both HIV & alcohol outcomes


NIH/National Institute on Alcohol Abuse and Alcoholism

New clinical research supported by the National Institutes of Health shows that increasing the intensity of treatment for alcohol use disorder (AUD) over time improves alcohol-related outcomes among people with HIV. This stepped approach to AUD treatment also improves HIV-related disease measures in this patient population. A report of the new study, led by researchers at Yale University, is now online in The Lancet HIV.

"These research findings demonstrate the potential of integrated treatment for AUD and HIV in improving health outcomes," said George F. Koob, Ph.D., director of the NIH's National Institute on Alcohol Abuse and Alcoholism (NIAAA), which provided primary funding for the new research, with additional funding provided by the National Institute on Drug Abuse (NIDA). "Moreover, it underscores the importance of integrating treatment for alcohol problems into mainstream health care."

In the United States, estimates of the prevalence of people with HIV who either drink heavily or have AUD range from 8 percent to 42 percent. Alcohol misuse can promote risky behaviors that increase the likelihood of acquiring HIV or transmitting it to others. It may also speed the progression of HIV in individuals with HIV infection and make it harder to follow medication regimens.

"Many people with HIV are unaware of, or not seeking treatment for, their alcohol use problems," said first author E. Jennifer Edelman, M.D., M.H.S., associate professor of medicine at Yale School of Medicine. "In addition, HIV clinicians often do not realize that there are effective medications and counseling that they can easily integrate into their practice for patients with alcohol use problems."

Noting that previous studies have found that integrating the treatment of opioid use disorder into HIV clinics improves both HIV and substance-related outcomes, the researchers wanted to evaluate whether such a model would similarly benefit people with HIV and AUD.

Treatment for AUD often occurs apart from an individual's HIV clinical care. The current study integrates the treatment for AUD with treatment for HIV.

Dr. Edelman and her colleagues conducted a randomized clinical trial in five Veterans Affairs-based HIV clinics with 128 people who had HIV infection and AUD. The researchers investigated integrated stepped alcohol treatment (ISAT) -- an approach that involved consecutive steps of increased AUD treatment intensity if lower intensity treatment did not produce desired results.

People in the ISAT group started their AUD treatment with an on-site addiction psychiatrist, focusing on the use of medications for AUD. If that step did not stop heavy drinking, the next step included the addition of a behavioral intervention conducted on-site to boost motivation to change drinking behavior and teach coping skills for managing high-risk situations. Researchers defined heavy drinking as five drinks or more per day for men and four drinks or more per day for women, on one or more days during the previous 14 days. Patients who continued to engage in heavy drinking were advanced to the final step of referral to specialty addiction treatment -- such as intensive outpatient treatment or residential treatment, depending on locally available resources. Patients in the control group received treatment as usual, which included alcohol screening, brief intervention, and referral to specialty addiction treatment at the VA at the discretion of their HIV clinician.

At the end of the six-month study, while both groups reported reduced alcohol intake, the researchers found no differences in drinks per week or HIV outcomes between the ISAT and control groups. Both groups then continued AUD treatment under treatment-as-usual (control) conditions. At the 12-month follow-up, individuals who had initially received ISAT were found to have fared better than individuals who only received treatment as usual. People in the ISAT group, for example, reported having fewer drinks per drinking day than people in the control group and a greater percentage of days abstinent. The ISAT group also had a higher percentage of people who reported no heavy drinking days.

"Importantly, we also observed that participants randomized to stepped AUD treatment were more likely to achieve an undetectable HIV viral load," said Dr. Edelman. "We believe that with decreased alcohol consumption, participants in the ISAT group were more likely to take their HIV medications consistently, translating into improved HIV viral control."

In an invited commentary on the new research in The Lancet HIV, co-authors Lorenzo Leggio, M.D., Ph.D., Senior Investigator in the NIH Intramural Research Program at NIAAA and NIDA, and Roberta Agabio, M.D., a psychiatrist at the University of Cagliari in Italy, welcomed the new findings as important for the HIV field and beyond.

"Stepped care approaches have been found to be effective for treating a variety of chronic diseases," said Dr. Leggio.

"These findings are a first indication of their potential value for treating AUD in the context of HIV treatment. The results warrant further investigation on how to optimize its use among people with HIV, and to explore its integration in other medical care settings. Indeed, the study is a compelling example of the need for trained clinicians across the spectrum of health care to recognize and treat AUD as a medical disorder amenable to a variety of treatment approaches."

florida80 05-18-2019 18:06

News Release 17-May-2019

'Stepped' treatment reduces drinking in patients with HIV


Yale University

New Haven, Conn. -- People with HIV who drink too much were more likely to reduce drinking after undergoing an approach to care known as integrated stepped alcohol treatment, according to a Yale-led study.

The finding supports greater use of this treatment model in HIV clinics to improve outcomes for patients with both HIV and drinking problems, the researchers said.

The study was published in The Lancet HIV.

Stepped care is used to treat some patients with chronic diseases such as hypertension and depression. It entails the use of different treatments that are "stepped up," or increased in intensity over time, in response to patients' needs. Prior to this new study, little research had been done to evaluate the impact of stepped care for patients struggling with alcohol use disorder, and none had been conducted in HIV treatment settings, the researchers said.

The research team recruited 128 individuals from one of five Veterans Affairs-based HIV clinics. They randomized the patients into one of two groups -- those given integrated stepped alcohol treatment and an equal number receiving treatment as usual.

The stepped-care patients were offered evidence-based treatments, including medication, motivational therapy, and specialty care at either an outpatient or residential treatment facility. By comparison, the treatment-as-usual patients were referred to specialty addiction treatment at the VA at the discretion of their HIV clinician.

At the end of the study period, the researchers found that patients who received integrated stepped care fared better overall. After 52 weeks, stepped-care patients had fewer heavy drinking days, drank less per drinking day, and had more days of abstinence, the researchers noted.

"We saw overall improvements in drinking," said Jennifer Edelman, M.D., lead author and associate professor in internal medicine. "We also found improved HIV outcomes at the 52-week mark."

The improvements in patients' HIV status were presumably associated with the reduced alcohol use, Edelman noted. "Over time, the patients receiving integrated stepped care showed decreases in alcohol use and a higher rate of undetectable HIV viral load, likely related to improved HIV medication adherence," she said.

The study results support the expanded use of integrated stepped care for alcohol misuse in settings where patients are already being treated for HIV, the researchers said.

florida80 05-18-2019 18:07

News Release 17-May-2019

Clinical trial at IU School of Medicine improves treatment of genetic rickets


Indiana University

[Image: Erik Imel, MD. Credit: Indiana University School of Medicine]

A new study shows that a drug developed in conjunction with investigators at Indiana University School of Medicine to alleviate symptoms of a rare musculoskeletal condition is significantly more effective than conventional therapies. The findings are published in The Lancet.

X-linked hypophosphatemia, or XLH, is a phosphate-wasting disease that causes rickets and osteomalacia, or softening of the bones, and can cause short stature, bowed legs, dental abscesses and bone pain. This rare, genetic disease affects about 1 in every 20,000 people.

Researchers recruited 61 children between the ages of 1 and 12 at 16 centers around the world, including the U.S., Canada, the United Kingdom, Sweden, Australia, Japan and Korea. The children were randomly assigned to receive either Burosumab, a biweekly injection approved by the Food and Drug Administration in April 2018, or conventional therapy of oral phosphate and active vitamin D taken several times a day. The primary outcome was improvement in rickets on X-rays, as scored by radiologists who were unaware of which treatment group each participant was in.

The children were observed for 64 weeks, and by 40 weeks of treatment, researchers found 72 percent of the children who received Burosumab achieved substantial healing of rickets, while only 6 percent of those in the conventional therapy group saw substantial healing. Burosumab also led to greater improvements in leg deformities, growth, distance walked in a 6-minute test and serum phosphorus and active vitamin D levels.

"This is the first study comparing Burosumab head-to-head with conventional therapy," said lead investigator Erik Imel, MD, associate professor of medicine at IU School of Medicine. "We now know the magnitude of benefit from Burosumab over the prior approach with conventional therapy. This information is critical for doctors to make treatment decisions for patients with XLH."

Researchers plan to continue studying the long-term effects of Burosumab, including the effect of childhood treatment on adult height outcomes and whether this treatment will decrease the need for surgeries to correct bowed legs.

Burosumab blocks a protein called fibroblast growth factor 23, which was originally discovered by investigators at Indiana University School of Medicine. Burosumab is marketed by Ultragenyx Pharmaceutical, Inc. in collaboration with Kyowa Hakko Kirin Co., Ltd. and its European subsidiary, Kyowa Kirin International PLC, under the brand name Crysvita.

florida80 05-18-2019 18:08

News Release 17-May-2019

IU researchers develop electric field-based dressing to help heal wound infections


Indiana University

[Image credit: Indiana University School of Medicine]

Researchers at Indiana University School of Medicine have found a way to charge up the fight against bacterial infections using electricity.

Work conducted in the laboratories of Chandan Sen, PhD, and Sashwati Roy, PhD, at the Indiana Center for Regenerative Medicine and Engineering has led to the development of a dressing that uses an electric field to disrupt biofilm infection. Their findings were recently published in the high-impact journal Annals of Surgery.

Bacterial biofilms are thin, slimy films of bacteria that form on some wounds, including burns or post-surgical infections, as well as after a medical device, such as a catheter, is placed in the body. These bacteria generate their own electricity, using their own electric fields to communicate and form the biofilm, which makes them more hostile and difficult to treat. The Centers for Disease Control and Prevention estimates 65 percent of all infections are caused by bacteria with this biofilm phenotype, while the National Institutes of Health estimates that number is closer to 80 percent.

Researchers at IU School of Medicine are the first to study the use of an electric field-based dressing to treat biofilms rather than antibiotics. They discovered that the dressing not only fights the bacteria on its own, but, when combined with other medications, can make them even more effective. This discovery has the potential to significantly change the way physicians treat patients with bacterial infections that are resistant to antibiotics. The dressing can also help prevent new biofilm infections from forming. Upon contact with body fluids such as wound fluid or blood, the dressing electrochemically self-generates 1 volt, which is not enough to hurt or electrocute the patient.

"This shows for the first time that bacterial biofilm can be disrupted by using an electroceutical dressing," said Chandan Sen, PhD, director of the Indiana Center for Regenerative Medicine and Engineering and associate vice president of research for the IU School of Medicine Department of Surgery. "This has implications across surgery as biofilm presence can lead to many complications in successful surgical outcomes. Such textile may be considered for serving as hospital fabric - a major source of hospital acquired infections"

Marketing of the dressing for burn care was recently approved by the Food and Drug Administration. The team is now studying the device's effectiveness in patients recovering from burns.

florida80 05-18-2019 18:09

Enzyme may indicate predisposition to cardiovascular disease

Study suggests that people with low levels of PDIA1 in blood plasma may be at high risk of thrombosis; this group also investigated PDIA1's specific interactions in cancer

Fundação de Amparo à Pesquisa do Estado de São Paulo

Measuring the blood plasma levels of an enzyme called PDIA1 could one day become a method of diagnosing a person's predisposition to cardiovascular disease even if they are healthy, i.e., not obese, diabetic or a smoker, and have normal cholesterol.

This is suggested by a study published in the journal Redox Biology by Brazilian researchers affiliated with the University of São Paulo (USP), the University of Campinas (UNICAMP) and Butantan Institute.

The investigation was conducted under the aegis of the Center for Research on Redox Processes in Biomedicine (Redoxome), one of the Research, Innovation and Dissemination Centers (RIDCs) funded by the São Paulo Research Foundation (FAPESP). Redoxome is hosted by USP's Chemistry Institute.

"This molecule belongs to the protein disulfide isomerase [PDI] family. Our study showed that people with low plasma levels of PDIA1 have a more inflammatory protein profile and hence run a higher risk of thrombosis. On the other hand, people with high levels of PDIA1 have more 'housekeeping' proteins associated with cell adhesion, homeostasis and the organism's normal functioning," said Francisco Rafael Martins Laurindo, a professor at the University of São Paulo's Medical School (FM-USP) and principal investigator for the study.

The study was conducted during the PhD research of Percíllia Victória Santos de Oliveira with a scholarship from FAPESP.

The group analyzed blood plasma samples from 35 healthy volunteers with no history of chronic or acute disease. None was a smoker or a user of recreational drugs or chronic medication.

Plasma was collected 10-15 times at intervals of days or weeks during a period of 10-15 months. Circulating PDI levels were within a small range for most individuals. Moreover, in a cohort of five individuals, PDIA1 levels were measured three times in a nine-hour period. The variability of the results was again negligible.

"However, the measurements showed that some patients had high levels of PDIA1, while the levels were very low, almost undetectable, in others. When the tests were repeated for the same person over time, these values hardly varied at all," said Laurindo, who heads the Translational Cardiovascular Biology Laboratory at the Heart Institute (InCor) attached to FM-USP's teaching and general hospital (Hospital das Clínicas).

The researchers also measured the levels of PDIA1 in 90 plasma bank samples from patients with chronic cardiovascular disease. The analysis consistently showed low levels of the enzyme.

They then conducted several additional proteomic studies to investigate how the plasma levels of PDIA1 correlated with an individual's protein signature. The adhesion and migration of cultured vein endothelial cells treated with PDIA1-poor plasma were impaired in comparison with those of cells treated with PDIA1-rich plasma.

These results led to the hypothesis that the plasma level of PDIA1 could be a window onto individual plasma protein signatures associated with endothelial function, which could indicate a possible predisposition to cardiovascular disease.

The study also showed no correlation between PDIA1 levels and well-known risk factors for cardiovascular disease, such as triglycerides and cholesterol.

The next steps for the research group include studying PDIA1 levels in conditions such as acute coronary disease, as well as other members of the protein disulfide isomerase family (there are more than 20 PDIs all told), to compare results and confirm whether all these enzymes are potential markers of vulnerability to cardiovascular disease.

Inhibitors

Clinical trials of inhibitors of other PDIs are being conducted by different groups of researchers in several parts of the world. Because these enzymes play various essential roles in cell survival, Laurindo explained, it is important to understand their specific interactions in the cancer context to design inhibitors capable of eliminating tumors with a minimum of toxicity to normal cells.

In another study, published in the American Journal of Physiology-Heart and Circulatory Physiology, the researchers used an antibody to inhibit PDIA1 on the surface of vascular cells and observed the effects of stimulation with several different mechanical forces, such as stretching and alterations to the rigidity of the extracellular matrix.

Resulting from research conducted during Leonardo Yuji Tanaka's postdoctoral internship (https://bv.fapesp.br/en/pesquisador/77430/leonardo-yuji-tanaka) with support from FAPESP (https://bv.fapesp.br/en/bolsas/151090), the study concluded that surface PDIA1 inhibition affected the cytoskeleton, an intracellular framework of filaments, thereby hindering cell migration.

"PDIA1 is fundamental for the ability of cells to migrate within the organism, and so it mustn't be completely inhibited. When the surface portion, which corresponds to less than 2% of total PDIA1, is silenced, the cell survives but loses fine regulation of cell direction during migration. This can be leveraged in the search for new disease mechanisms and drugs," Laurindo explained.

###

florida80 05-18-2019 18:10

Dangerous pathogens use this sophisticated machinery to infect hosts

A detailed new model of a bacterial secretion system provides directions for developing precisely targeted antibiotics

California Institute of Technology





[Image: A structural model of a dynamic machine, called a Type IV secretion system, used by many bacteria in order to inject toxic molecules into cells and also to spread genes... Credit: Caltech]

Gastric cancer, Q fever, Legionnaires' disease, whooping cough--though the infectious bacteria that cause these dangerous diseases are each different, they all utilize the same molecular machinery to infect human cells. Bacteria use this machinery, called a Type IV secretion system (T4SS), to inject toxic molecules into cells and also to spread genes for antibiotic resistance to fellow bacteria. Now, researchers at Caltech have revealed the 3D molecular architecture of the T4SS from the human pathogen Legionella pneumophila in unprecedented detail. This could in the future enable the development of precisely targeted antibiotics for the aforementioned diseases.

The work was done in the laboratory of Grant Jensen, professor of biophysics and biology and Howard Hughes Medical Institute investigator, in collaboration with the laboratory of Joseph Vogel at the Washington University School of Medicine in St. Louis (WUSTL). A paper describing the research appeared online on April 22 in the journal Nature Microbiology.

There are nine different types of bacterial secretion systems, Type IV being the most elaborate and versatile. A T4SS can ferry a wide variety of toxic molecules--up to 300 at once--from a bacterium into its cellular victim, hijacking cellular functions and overwhelming the cell's defenses.

In 2017, Caltech postdoctoral scholar Debnath Ghosal and his collaborators used a technique called electron cryotomography to reveal, for the first time, the overall low-resolution architecture of the T4SS in Legionella, the bacterium that causes Legionnaires' disease, a severe and often lethal form of pneumonia.

Ghosal, along with Kwangcheol Jeong of WUSTL and their colleagues, have now made a detailed structural model of this dynamic multi-component machine. The team also made precise perturbations to the bacterium's genes to study mutant versions of the T4SS, revealing how this complex machine organizes and assembles.

The model revealed that the secretion system is composed of a distinct chamber and a long channel, like the chamber and barrel of a gun. Characterizing these and other components of the T4SS could enable the development of precisely targeted antibiotics.

Current antibiotics act broadly and wipe out bacteria throughout the body, including the beneficial microorganisms that live in our gut. In the future, antibiotics could be designed to block only the toxin delivery systems (such as the T4SS) of harmful pathogens, rendering the bacteria inert and harmless without perturbing the body's so-called "good bacteria."

###

The paper is titled "Molecular architecture, polar targeting and biogenesis of the Legionella Dot/Icm T4SS." Ghosal and Jeong are co-first authors. In addition to Jensen and Vogel, other co-authors are former Caltech postdoctoral scholar Yi-Wei Chang, now of the University of Pennsylvania; Jacob Gyore of WUSTL; Lin Teng of the University of Florida; and Adam Gardner of the Scripps Research Institute. The work was funded by the National Institutes of Health.



florida80 05-18-2019 18:11

News Release 17-May-2019

A new approach to targeting cancer cells

UC Riverside researchers develop new drugs that target therapeutically relevant protein surfaces

University of California - Riverside





[Image: The Pellecchia lab at UC Riverside is working on an approach in which effective drugs can be obtained by appropriately designing molecules to contain an aryl-fluoro sulfate (the chemical group... Credit: Pellecchia lab, UC Riverside.]

RIVERSIDE, Calif. -- A University of California, Riverside, research team has come up with a new approach to targeting cancer cells that circumvents a challenge faced by currently available cancer drugs.

A cancer target is often a rogue protein that signals cancer cells to proliferate uncontrollably and invade organs. Modern cancer drugs have emerged that work by forming a tight bond between the drug and a particular amino acid called cysteine, one of the 20 natural amino acids that constitute our proteins. Cysteine is unique in that it can react with specific organic functional groups to form a strong molecular bond.

Only a few new cancer drugs that target cysteine have been recently approved by the Food and Drug Administration, or FDA. A challenge cancer researchers face is that cysteine is rarely found within binding sites of cancer targets, limiting the application of this approach to only a few drug targets.

The UC Riverside research team has now met this challenge by exploring the development of drugs that target other potentially reactive amino acids, such as lysine, tyrosine, or histidine, which occur more often within the binding site of the target.

The researchers also addressed another challenge: The target they used for proof of concept was a protein-protein interaction, or PPI, target. PPIs represent a large class of possible therapeutic targets for which designing effective drugs is particularly difficult. This is because PPIs lack a well-defined, deep binding pocket onto which drugs can be designed to bind tightly.

"To date, there is only one drug approved by the FDA that was designed to antagonize -- or block -- a PPI target," said Maurizio Pellecchia, a professor of biomedical sciences in the School of Medicine, who led the research. "Only a few others have entered clinical trials. Our approach provides novel and effective avenues to derive potent and selective PPI antagonists by designing drugs that can react with lysine, tyrosine, or histidine residues that are ubiquitously present at binding interfaces of PPIs."

Study results appear in the Journal of Medicinal Chemistry.

Pellecchia, who holds the Daniel Hays Chair in Cancer Research at UCR, explained that academic researchers, the biotechnology industry, and pharmaceutical companies are heavily pursuing the design of "covalent drugs" that bind irreversibly with their targets. Those that target cancer cells most often target cysteine because it is more reactive than all other amino acids in a protein target. Oncology drugs such as Osimertinib, Ibrutinib, Neratinib, and Afatinib have all been approved in very recent years by the FDA, he said, and all target a cysteine that is present on the binding site of their respective targets.

"Our work widens the available target space beyond cysteine," he added. "Such covalent agents could represent significant stepping stones in the development of novel drug candidates against PPIs, which represent an untapped large class of therapeutic targets not only in oncology, but also in other conditions including neurodegenerative and inflammatory diseases."

###

florida80 05-18-2019 18:11

News Release 17-May-2019

Percutaneous edge-to-edge repair in patients with heart failure and secondary mitral regurgitation

A PCR statement on behalf of PCR and the EAPCI

PCR




Paris, France, 21 May 2019. Heart failure is a common cardiovascular disorder with ominous prognosis despite significant therapeutic advances. Mitral regurgitation (MR, leaking of the mitral valve within the heart) affects at least 50% of patients with heart failure and is independently associated with worse prognosis. Timely diagnosis is essential, and management is complex, requiring an expert approach. These patients should be referred for assessment and management by a multidisciplinary Heart Team.

Detailed imaging assessment is essential to confirm the diagnosis and severity of MR and provide further insights into cardiac anatomy and function that will guide the choice of management.

The first essential step in management is the institution of optimised pharmacological therapy and use of cardiac resynchronisation devices according to guideline recommendations. Recent evidence supports the use of transcatheter mitral edge-to-edge repair using the MitraClip device in carefully selected patients with heart failure-associated MR who remain symptomatic despite these measures.

Surgical treatment of heart failure-associated MR should be considered in patients with coronary artery disease undergoing surgical revascularisation. Circulatory support devices and cardiac transplantation are an alternative in patients with extreme left and/or right ventricular failure and no accompanying severe comorbidity.

Expensive, high-risk and ultimately futile procedures should be avoided in patients who will derive little or no symptomatic benefit or quality of life improvement. Specialist palliative care should be available for these patients.

This summary is based upon a published Viewpoint article [1] and underpins a more extensive joint position statement in preparation by a collaborative working group derived from the European Association of Percutaneous Cardiovascular Intervention (EAPCI), European Heart Rhythm Association (EHRA), European Association of Cardiovascular Imaging (EACVI) and Heart Failure Association (HFA) of the European Society of Cardiology.

florida80 05-18-2019 18:12

Nivolumab with ipilimumab: Combination has added benefit in advanced renal cell carcinoma

Advantages in overall survival are not offset by serious disadvantages

Institute for Quality and Efficiency in Health Care




Renal cell carcinoma is one of the cancer diseases for which the range of promising treatment options has become considerably wider in recent years. In several early benefit assessments since 2013, the German Institute for Quality and Efficiency in Health Care (IQWiG) has already been able to determine an added benefit of a new drug in comparison with the respective appropriate comparator therapy (ACT).

The Institute has also come to a positive conclusion in its current assessment of nivolumab with ipilimumab: The drug combination has considerable added benefit in comparison with the ACT (sunitinib) for patients with advanced renal cell carcinoma and an intermediate risk score, and even major added benefit for patients with at least three risk factors and a corresponding unfavourable prognosis.

Antibodies as multipurpose weapons

In recent years, both monoclonal antibodies have already been the subject of several early benefit assessments because, alone or in combination, they are also used against other types of cancer, e.g. melanoma, squamous cell carcinoma of the lung and non-small cell lung cancer. As so-called checkpoint inhibitors, they block different molecules on the outside of immune cells.

Nivolumab binds to the PD-1 receptors on T lymphocytes and thus prevents their defence-inhibiting effect. Ipilimumab, in contrast, weakens the inhibitory effect of the CTLA-4 molecules on the T lymphocytes. Both increase proliferation and activity of the immune cells, so that they can fight the tumour cells more vigorously.

Premature end of study after interim analysis

The data for both corresponding dossier assessments (in each case, one of the drugs in combination with the other) are from the randomized controlled trial CheckMate 214. The study was ended prematurely after the first interim analysis because the results on overall survival were clearly in favour of the combination.

For patients with an intermediate risk score, the study data showed statistically significant and clinically relevant advantages in overall survival, symptoms and health-related quality of life. With both positive and negative effects of approximately the same magnitude regarding side effects, the conclusion for this outcome category was: Greater or lesser harm is not proven. Overall, there was an indication of a considerable added benefit.

The survival advantage was even more pronounced in patients with higher risk scores. In addition, there were hints of lesser harm of varying extent for a number of side effects, accompanied by hints of greater harm in three side effects. However, the disadvantages did not outweigh the advantages, so the Institute sees an overall indication of a major added benefit.

G-BA decides on the extent of added benefit

Both dossier assessments are part of the early benefit assessment according to the Act on the Reform of the Market for Medicinal Products (AMNOG) supervised by the Federal Joint Committee (G-BA). After their publication, the G-BA conducts a commenting procedure and makes a final decision on the extent of the added benefit.

More English-language information will be available soon (extracts of the dossier assessments as well as easily understandable information on informedhealth.org). The website http://www.gesundheitsinformation.de, published by IQWiG, provides easily understandable German-language information.

florida80 05-18-2019 18:14

PCR statement on evolving indications for transcatheter aortic valve implantation

Aortic stenosis, surgical aortic valve replacement, transcatheter aortic valve implantation

PCR




Paris, France, 21 May 2019.

Aortic stenosis is in most cases a degenerative disease-causing calcification and immobility of the aortic valve leaflets that in turn leads to left ventricular outflow obstruction. Severe, symptomatic aortic stenosis is the most common heart valve lesion leading to intervention in Europe and the USA. Affected patients suffer from symptoms including heart failure, syncope and angina and have a dismal prognosis if left untreated. There is no effective medical treatment, and the only intervention is replacement of the diseased valve by means of surgical aortic valve replacement (SAVR) or transcatheter aortic valve implantation (TAVI). The latter less invasive treatment has resulted in an unprecedented effort to conduct a series of high quality randomised clinical trials (RCTs) in a field devoid of randomised evidence prior to the advent of TAVI.

Previously accumulated evidence among patients with severe aortic stenosis who are at extreme (inoperable), high or intermediate surgical risk was summarised in the 2017 ESC/EACTS Guidelines on the management of valvular heart disease (Baumgartner H et al. EHJ 2017, https://doi.org/10.1093/eurheartj/ehx391). In these guidelines, TAVI was recommended as the therapy of choice among patients at extreme risk (inoperable) and as a treatment alternative to SAVR among patients at increased surgical risk, with the decision made by the Heart Team according to individual patient characteristics and TAVI being favoured in elderly patients suitable for transfemoral access, while SAVR remained the standard of care among patients at low surgical risk.

Recently, the results of two additional RCTs comparing TAVI with SAVR extend the available evidence to patients at low surgical risk. In the PARTNER 3 trial, TAVI with use of the balloon-expandable Edwards SAPIEN S3 prosthesis, compared with SAVR, was associated with a lower risk of the primary composite endpoint of death, stroke and rehospitalisation at 1 year among low surgical risk patients (mean age 74±6 years, STS score 1.9±0.7%) (Mack M et al. NEJM 2019, https://doi.org/10.1056/NEJMoa1814052). In the EVOLUT Low Risk trial, TAVI with use of the self-expanding supra-annular CoreValve/Evolut prosthesis, compared with SAVR, was non-inferior with respect to the primary composite endpoint of death and disabling stroke and superior for heart failure rehospitalisation at 2 years among patients at low surgical risk (mean age 74±6 years, STS score 1.9±0.7%) (Popma J et al. NEJM 2019, https://doi.org/10.1056/NEJMoa1816885). These trials included patients who were anatomically good candidates for TAVI.

An updated meta-analysis of seven RCTs comparing TAVI and SAVR among 8,020 patients with severe, symptomatic aortic stenosis reported a lower risk of all-cause mortality (12% relative risk reduction) and stroke (19% relative risk reduction), irrespective of underlying surgical risk, throughout two years of follow-up (Siontis G et al. EHJ 2019, https://doi.org/10.1093/eurheartj/ehz275).
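For readers unfamiliar with the term, the statement does not spell out how relative risk reduction (RRR) is computed; the following is the standard definition, not something stated in the source:

\mathrm{RRR} = 1 - \frac{r_{\mathrm{TAVI}}}{r_{\mathrm{SAVR}}}

where r denotes the event rate in each treatment arm over the follow-up period. The reported 12% RRR in all-cause mortality therefore means that the mortality rate observed with TAVI was 0.88 times the rate observed with SAVR.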

In aggregate, the available evidence from the low surgical risk trials and meta-analyses can be summarised as follows:


• TAVI is superior to SAVR at two years with respect to patient-oriented endpoints, including:
◦ death
◦ stroke
◦ rehospitalisation


• TAVI is associated with improved health care resource utilisation owing to:
◦ shorter interventions without need for general anaesthesia, cardiopulmonary bypass and intensive care unit monitoring
◦ shorter hospitalisation duration
◦ reduced need for rehabilitation services
◦ faster recovery and more rapid restoration of daily life activities and quality of life
◦ notwithstanding, the cost-effectiveness of TAVI requires further study in view of the current cost of transcatheter heart valves



These findings constitute a paradigm shift that will affect the care of patients with severe, symptomatic aortic stenosis in several ways:

• The favourable outcomes of TAVI are consistent across the entire risk spectrum, suggesting that surgical risk estimation is no longer the basis to guide the choice between TAVI and SAVR.
• Instead, the Heart Team will weigh clinical and anatomic characteristics to identify the best treatment option for individual patients with transfemoral TAVI replacing SAVR as default therapy in more patients.
• Prosthetic valve selection will be determined by life expectancy and durability, with surgically implanted mechanical valve prostheses being favoured in younger patients (<50 years of age) and bioprostheses (TAVI or SAVR) being favoured in older individuals (>65 years of age).


Future research will need to address remaining uncertainties and options for further improvement in outcomes:

• evaluation of TAVI in younger patients (<70 years)
• assessment of long-term durability using predefined clinical and echocardiographic assessment
• evaluation of TAVI in patients with bicuspid aortic valve disease
• evaluation of TAVI in patients with concomitant coronary artery disease
• continued measures to reduce the need for permanent pacemaker implantation
• definition of the optimal short- and long-term antithrombotic therapy
• TAVI in asymptomatic patients with severe aortic stenosis


###

florida80 05-18-2019 18:15

News Release 17-May-2019

Early dengue virus infection could 'defuse' Zika virus


German Center for Infection Research




"We now know for sure that Zika virus infection during pregnancy can affect the unborn foetus in such a way that the child develops microcephaly and other severe symptoms," explains Prof Felix Drexler, a virologist at the Charité who has been developing diagnostic tests for Zika and other viruses at the DZIF. Just a few years ago, pictures of affected new-borns were cause for worldwide dismay and perplexity. "However, what we did not understand then was that high incidence of microcephaly seemed to occur particularly in northeastern Brazil," says Drexler. Why are expecting mothers in these regions at a higher risk of developing a severe Zika-associated disease than in other regions? The scientists consequently began to search for cofactors that have an influence on whether a Zika infection during pregnancy will develop fatal consequences or not.

A suspected cofactor

Dengue viruses, which are widespread in Latin America and cause dengue fever, were suspected cofactors. Initially, the scientists suspected that the antibodies humans produce against the dengue virus contribute to the foetal damage caused in later Zika infection. It has been known for a long time that these antibodies can enhance subsequent dengue infections under certain conditions.

However, in the case of Zika, the opposite seems to be the case. "Surprisingly, our study has shown that a previous dengue infection can protect against Zika-associated damage," emphasizes Drexler.

The study

As a first step in investigating the interactions between dengue and Zika viruses, the genomes of all known dengue viruses in Brazil were compared to each other. This was to enable the researchers to find out whether dengue viruses in northeastern Brazil had induced a different immunity over the last decades than that observed in other regions of Brazil. In addition, the scientists conducted extensive serological tests in Salvador, Brazil: Samples from a case-control study were tested for antibodies against four different dengue serotypes. Samples from 29 mothers who had undergone Zika infection during pregnancy and gave birth to children with microcephaly were investigated. Samples from 108 mothers who had also undergone Zika infection during pregnancy but gave birth to healthy children were used as controls. In this project, scientists from the Charité - Universitätsmedizin Berlin collaborated closely with the Federal University of Bahia and the Institute of Virology of the Bonn University Medical Centre.

Cofactor becomes a protective factor

The study showed that an existing immunity against dengue virus significantly reduces the risk of Zika-associated microcephaly in newborns. "We can now say that people who have had early infections with dengue do not need to worry much about contracting more severe forms of Zika infection due to this," summarises Drexler.

This is an important message for pregnant women.

Consequently, it could not be confirmed that the dengue virus acts as a cofactor for congenital Zika infection. The scientists are now looking for further cofactors and other possibilities of identifying the risk of microcephaly early on.

Background

Felix Drexler and his research group have already developed several novel Zika virus tests. The Zika diagnostics project in Brazil was launched by the DZIF in order to act against the threat of emerging infections. It is also being funded by the EU programme Horizon 2020.

Zika and dengue viruses

Zika viruses are usually transmitted by mosquitoes, particularly by the Aedes species, but they can also be transmitted sexually. Symptoms of Zika include rashes, headaches, joint pain and muscle pain, conjunctivitis and sometimes fever.

However, these symptoms are considered mild compared to other tropical diseases that are transmitted by mosquitoes. During pregnancy, the virus can cause microcephaly and other malformations in the unborn child.

The dengue virus is also transmitted by mosquitoes of the Aedes species and has similar symptoms to Zika infection. Dengue usually causes high temperatures, headaches, muscle and joint pain. People usually recover within a few days, but complications may also occur. Dengue fever is one of the most common diseases transmitted by mosquitoes worldwide.

florida80 05-18-2019 18:16

Wearable cooling and heating patch could serve as personal thermostat and save energy


University of California - San Diego




[Image: Prototype of the cooling and heating patch embedded in a mesh armband. Credit: David Baillot/UC San Diego Jacobs School of Engineering]


Engineers at the University of California San Diego have developed a wearable patch that could provide personalized cooling and heating at home, work, or on the go. The soft, stretchy patch cools or warms a user's skin to a comfortable temperature and keeps it there as the ambient temperature changes. It is powered by a flexible, stretchable battery pack and can be embedded in clothing. Researchers say wearing it could help save energy on air conditioning and heating.

The work is published May 17 in the journal Science Advances.

"This type of device can improve your personal thermal comfort whether you are commuting on a hot day or feeling too cold in your office," said Renkun Chen, a professor of mechanical and aerospace engineering at UC San Diego who led the study.

The device, which is at the proof-of-concept stage, could also save energy. "If wearing this device can make you feel comfortable within a wider temperature range, you won't need to turn down the thermostat as much in the summer or crank up the heat as much in the winter," Chen said. Keeping a building's set temperature 12 degrees higher during the summer, for example, could cut cooling costs by about 70 percent, he noted.

There are a variety of personal cooling and heating devices on the market, but they are not the most convenient to wear or carry around. Some use a fan, and some need to be soaked or filled with fluid such as water.

Chen and a team of researchers at the UC San Diego Jacobs School of Engineering designed their device to be comfortable and convenient to wear. It's flexible, lightweight and can be easily integrated into clothing.

The patch is made of thermoelectric alloys--materials that use electricity to create a temperature difference and vice versa--sandwiched between stretchy elastomer sheets. The device physically cools or heats the skin to a temperature that the wearer chooses.

"You could place this on spots that tend to warm up or cool down faster than the rest of the body, such as the back, neck, feet or arms, in order to stay comfortable when it gets too hot or cold," said first author Sahngki Hong, a UC San Diego mechanical engineering alumnus who worked on the project as a PhD student in Chen's lab.

The researchers embedded a prototype of the patch into a mesh armband and tested it on a male subject. Tests were performed in a temperature-controlled environment. In two minutes, the patch cooled the tester's skin to a set temperature of 89.6 degrees Fahrenheit. It kept the tester's skin at that temperature as the ambient temperature was varied between 71.6 and 96.8 degrees Fahrenheit.

A building block for smart clothing

The ultimate goal is to combine multiple patches together to create smart clothing that can be worn for personalized cooling and heating. So engineers designed a soft electronic patch that can stretch, bend and twist without compromising its electronic function.

The work is a collaboration between several research groups at the UC San Diego Jacobs School of Engineering. Chen's lab, which specializes in heat transfer technology, led the study. They teamed up with nanoengineering professors Sheng Xu, an expert in stretchable electronics, Shirley Meng, an expert in battery technology, Ping Liu, who is also a battery expert, and Joseph Wang, a wearable sensors expert.

The researchers built the patch by taking small pillars of thermoelectric materials (made of bismuth telluride alloys), soldering them to thin copper electrode strips, and sandwiching them between two elastomer sheets.

The sheets are specially engineered to conduct heat while being soft and stretchy. Researchers created the sheets by mixing a rubber material called Ecoflex with aluminum nitride powder, a material with high thermal conductivity.

The patch uses an electric current to move heat from one elastomer sheet to the other. As the current flows across the bismuth telluride pillars, it drives heat along with it, causing one side of the patch to heat up and the other to cool down.

"To do cooling, we have the current pump heat from the skin side to the layer facing outside," Chen explained. "To do heating, we just reverse the current so heat pumps in the other direction."

The patch is powered by a flexible battery pack. It is made of an array of coin cells all connected by spring-shaped copper wires and embedded in a stretchable material.

Saving energy

One patch measures 5 × 5 centimeters in size and uses up to 0.2 watts worth of power. Chen's team estimates that it would take 144 patches to create a cooling vest. This would use about 26 watts total to keep an individual cool on an average hot day (during extreme heat, estimated power use would climb up to 80 watts, which is about how much a laptop uses). By comparison, a conventional air conditioning system uses tens of kilowatts to cool down an entire office.
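As a back-of-the-envelope check of these figures, the short Python sketch below multiplies the quoted per-patch peak draw by the quoted patch count; the assumption that every patch runs at peak power simultaneously is ours, not the team's.

# Rough sanity check of the vest power figures quoted above.
# The per-patch peak (0.2 W) and patch count (144) come from the release;
# running every patch at peak simultaneously is our simplifying assumption.
PATCH_PEAK_W = 0.2   # peak draw of one 5 x 5 cm patch, in watts
N_PATCHES = 144      # the team's estimate for a full cooling vest

peak_vest_w = N_PATCHES * PATCH_PEAK_W
print(f"Vest draw with every patch at peak: {peak_vest_w:.1f} W")
# Prints 28.8 W -- the same order as the ~26 W the team quotes for an
# average hot day, and far below the tens of kilowatts a conventional
# air conditioning system consumes.

This agrees with the release's estimate to within the rounding of the quoted numbers.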

It's more energy-efficient to cool down an individual person than a large room, researchers noted. "If there are just a handful of occupants in that room, you are essentially consuming thousands of watts per person for cooling. A device like the patch could drastically cut down on cooling bills," Chen said.

The team is now working on patches that could be built into a prototype cooling and heating vest. They hope to commercialize the technology in a few years.

"We've solved the fundamental problems, now we're tackling the big engineering issues--the electronics, hardware, and developing a mobile app to control the temperature," Chen said.

###
Paper title: "Wearable Thermoelectrics for Personalized Thermoregulation." Co-authors include Yue Gu and Joon Kyo Seo.

This work is supported by the Advanced Research Projects Agency - Energy (ARPA-E, grant DE-AR0000535) and UC San Diego startup funds.



florida80 05-18-2019 18:17

Earliest evidence of the cooking and eating of starch

Early human beings who lived around 120,000 years ago in South Africa were 'ecological geniuses' who were able to exploit their environment intelligently for suitable food and medicines

University of the Witwatersrand




[Image: The Klasies River cave in the southern Cape of South Africa. Credit: Wits University]


New discoveries made at the Klasies River Cave in South Africa's southern Cape, where charred food remains from hearths were found, provide the first archaeological evidence that anatomically modern humans were roasting and eating plant starches, such as those from tubers and rhizomes, as early as 120,000 years ago.

The new research by an international team of archaeologists, published in the Journal of Human Evolution, provides the previously lacking archaeological evidence to support the hypothesis that the duplication of the starch digestion genes is an adaptive response to an increased starch diet.

"This is very exciting. The genetic and biological evidence previously suggested that early humans would have been eating starches, but this research had not been done before," says Lead author Cynthia Larbey of the Department of Archaeology at the University of Cambridge. The work is part of a systemic multidisciplinary investigation into the role that plants and fire played in the lives of Middle Stone Age communities.

The interdisciplinary team searched for and analysed undisturbed hearths at the Klasies River archaeological site.

"Our results showed that these small ashy hearths were used for cooking food and starchy roots and tubers were clearly part of their diet, from the earliest levels at around 120,000 years ago through to 65,000 years ago," says Larbey. "Despite changes in hunting strategies and stone tool technologies, they were still cooking roots and tubers."

Professor Sarah Wurz from the School of Geography, Archaeology and Environmental Studies at the University of the Witwatersrand in Johannesburg, South Africa (Wits University) and principal investigator of the site, says the research shows that "early human beings followed a balanced diet and that they were ecological geniuses, able to exploit their environments intelligently for suitable foods and perhaps medicines".

By combining cooked roots and tubers as a staple with protein and fats from shellfish, fish, small and large fauna, these communities were able to optimally adapt to their environment, indicating great ecological intelligence as early as 120 000 years ago.

"Starch diet isn't something that happens when we started farming, but rather, is as old as humans themselves," says Larbey. Farming in Africa only started in the last 10 000 years of human existence.

Humans living in South Africa 120 000 years ago formed and lived in small bands.

"Evidence from Klasies River, where several human skull fragments and two maxillary fragments dating 120 000 years ago occur, show that humans living in that time period looked like modern humans of today. However, they were somewhat more robust," says Wurz.

Klasies River is a very famous early human occupation site on the Cape coast of South Africa excavated by Wurz, who, along with Susan Mentzer of the Senckenberg Institute and Eberhard Karls Universität Tübingen, investigated the small (c. 30 cm in diameter) hearths.

###

The research to look for the plant materials in the hearths was inspired by Prof Hilary Deacon, who passed the Directorship of the Klasies River site on to Wurz. Deacon has done groundbreaking work at the site and in the 1990s pointed out that there would be plant material in and around the hearths. However, at the time, the micro methods were not available to test this hypothesis.

florida80 05-18-2019 18:17

'Imagine...' -- our attitudes can change solely by the power of imagination

Roland Benoit and Philipp Paulus together show that our attitudes can be influenced not only by what we actually experience but also by what we imagine

Max Planck Institute for Human Cognitive and Brain Sciences




Sometimes in life there are special places that seem to stand out to us - a school playground, perhaps an old church, or that inconspicuous street corner where you were kissed for the first time. Before the kiss you had never even noticed that corner. It's as if the special experience with that beloved person transferred positive emotion to the location. Our attitude towards these places thus suddenly changes - they become valuable to us. But could this also happen purely by the power of imagination rather than by actual experiences? Roland Benoit and Philipp Paulus from the Max Planck Institute for Human Cognitive and Brain Sciences, together with Daniel Schacter from Harvard University, have examined this question in a study published in the journal Nature Communications. They show that our attitudes can be influenced not only by what we actually experience but also by what we imagine. Furthermore, they believe the phenomenon is based on activity in a particular location in the front of our brains, the ventromedial prefrontal cortex.

Participants in their study were first asked to name people that they like very much and also people they don't like at all. In addition, they were asked to provide a list of places that they considered to be neutral. Later, when the participants were lying in the MRI scanner, they were asked to vividly imagine how they would spend time with a much-liked person at one of the neutral places. "So I might imagine myself with my daughter in the elevator of our institute, where she wildly pushes all the buttons. Eventually, we arrive at the roof top terrace, where we get out to enjoy the view," describes first author Roland Benoit, who heads the research group 'Adaptive Memory'.

After the MRI scanning, he and his colleagues were able to determine that the attitudes of the participants towards the places had changed: the previously neutral places that had been imagined with liked people were now regarded more positively than at the beginning of the study. The authors first observed this effect with study participants in Cambridge, MA, and then successfully replicated it in Leipzig, Germany. "Merely imagining interacting with a much-liked person at a neutral place can transfer the emotional value of the person to this place. And we don't even have to actually experience the episode in reality," is how co-author Daniel Schacter sums it up.

Using MRI data, the researchers were able to show how this mechanism works in the brain. The ventromedial prefrontal cortex plays an important role in this process. This, the authors assume, is where information about individual people and places from our environment is stored. But this region also evaluates how important individual people and places are for us. "We propose that this region bundles together representations of our environment by binding together information from the entire brain into an overall picture," Roland Benoit explains.

"For example, there would be a representation with information about my daughter - what she looks like, how her voice sounds, how she reacts in certain situations. The idea now is that these representations also include an evaluation - for example, how important my daughter is to me and how much I love her."

Indeed, when the participants thought of a person that they liked more strongly, the scientists saw signs of greater activity in that region. "Now, when I imagine my daughter in the elevator, both her representation and that of the elevator become active in the ventromedial prefrontal cortex. This, in turn, can connect these representations - the positive value of the person can thus transfer to the previously neutral location."

Why are the researchers interested in this phenomenon? They want to better understand the human ability to experience hypothetical events through imagination, and how we learn from imagined events in much the same way as from actual experiences. This mechanism can potentially augment future-oriented decisions and also help us avoid risks. According to Benoit, it will be important to also understand the consequences of negative thoughts: "In our study, we show how positive imaginings can lead to a more positive evaluation of our environment. I wonder how this mechanism influences people who tend to dwell on negative thoughts about their future, such as people who suffer from depression. Does such rumination lead to a devaluation of aspects of their life that are actually neutral or even positive?" This could be the next interesting research question for his team.

florida80 05-18-2019 18:18

Study finds narrowing gender gap in youth suicides

Recent data show a disproportionate increase in the suicide rate among female relative to male youth

Nationwide Children's Hospital




New research from Nationwide Children's Hospital finds a disproportionate increase in youth suicide rates for females relative to males, particularly in younger youth aged 10-14 years. The report, which describes youth suicide trends in the United States from 1975 to 2016, appears this week in JAMA Network Open.

Suicide is the second leading cause of death among youth aged 10-19 years in the U.S., with rates historically higher in males than females. However, recent reports from the Centers for Disease Control and Prevention reveal female youth are experiencing a greater percent increase in suicide rates compared to males.

Donna Ruch, PhD, a post-doctoral researcher in the Center for Suicide Prevention and Research at the Research Institute at Nationwide Children's Hospital, examined these trends by investigating suicide rates among U.S. youth aged 10-19 years from 1975 through 2016.

The researchers found that, after a downward trend beginning in the early 1990s, suicide rates for both sexes have increased since 2007, with rates for females rising more sharply. There was a significant and disproportionate increase in suicide rates for females relative to males, with the largest percentage increase in younger females. These trends were observed across all regions of the country.

"Overall, we found a disproportionate increase in female youth suicide rates compared to males, resulting in a narrowing of the gap between male and female suicide rates," said Dr. Ruch.

When the researchers looked at the data by method, they found the rates of female suicides by hanging or suffocation are approaching those of males. This is especially troubling in light of the gender paradox in suicidal behavior, according to Jeff Bridge, PhD, director of the Center for Suicide Prevention and Research at Nationwide Children's and co-author of the new study.

Dr. Bridge said females have higher rates of non-fatal suicidal behavior, such as thinking about and attempting suicide, but more males die by suicide than females.

"One of the potential contributors to this gender paradox is that males tend to use more violent means, such as guns or hanging," said Dr. Bridge. "That makes the narrowing of the gender gap in suicide by hanging or suffocation that we found especially concerning from a public health perspective."

The researchers called for future work to examine whether there are gender-specific risk factors that have changed in recent years and how these determinants can inform intervention.

"From a public health perspective, in terms of suicide prevention strategies, our findings reiterate the importance of not only addressing developmental needs but also taking gender into account," said Dr. Ruch.

Dr. Bridge emphasized that asking children directly about suicide will not trigger subsequent suicidal thinking or behavior.

"Parents need to be aware of the warning signs of suicide, which include a child making suicidal statements, being unhappy for an extended period, withdrawing from friends or school activities or being increasingly aggressive or irritable," he said. "If parents observe these warning signs in their child, they should consider taking the child to see a mental health professional."

florida80 05-18-2019 18:19

Study may help prevent relapse in cocaine use disorder patients

Brazilian researchers combined cognitive dysfunction tests with an analysis of drug use patterns to identify patients at high risk of relapse after treatment

Fundação de Amparo à Pesquisa do Estado de São Paulo




A study conducted at the University of São Paulo (USP) in Brazil and described in an article published in the journal Drug and Alcohol Dependence may help healthcare workers identify patients who risk relapse after undergoing treatment for cocaine use disorder.

According to the authors, the findings reinforce the need to offer individualized treatment strategies for these cases, which are considered severe.

The principal investigator for the study was Paulo Jannuzzi Cunha, a professor at the University of São Paulo's Medical School (FM-USP), with a postdoctoral research scholarship from São Paulo Research Foundation - FAPESP and support from the National Council for Scientific and Technological Development (CNPq).

For 30 days, the researchers monitored 68 patients admitted to the Psychiatry Institute of Hospital das Clínicas, FM-USP's general and teaching hospital in São Paulo, for treatment of cocaine dependence. All patients volunteered to participate in the study. They were contacted three months after discharge to verify their abstinence. All but 14 reported a relapse, defined as at least one episode of cocaine use during the period.

One of the goals of the study was to determine whether the 11 criteria for diagnosing chemical dependence established in the DSM-5 were also effective at predicting the response to treatment. DSM-5 is the latest edition of the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association (APA) and regarded as the standard classification by mental health professionals in many parts of the world.

"Our hypothesis was that these criteria would not help accurately predict relapse. However, after completing the study, we recognized that they may in fact be useful for predicting vulnerability to relapse," Cunha told.

The DSM-5 criteria for diagnosing a substance use disorder include taking the drug in larger amounts and/or for longer periods than intended; craving, or a strong desire to use the drug; giving up social, occupational or recreational activities because of substance use; continued use despite having persistent social or interpersonal problems caused or exacerbated by the effects of the drug; tolerance (needing increasing amounts to achieve the desired effect); and withdrawal symptoms, among others.

According to the DSM-5 guidelines, a substance use disorder can be considered mild if two or three of the 11 criteria are met for a year, moderate if four or five are met for a year, or severe if six or more are met for a year.

"Our sample included only cases classified as severe. We observed a clear difference between patients who met six to eight criteria and those who met nine to 11. The relapse rate was significantly higher in the latter group," said Danielle Ruiz Lima, first author of the article.

The results suggested that the three categories recommended by the DSM-5 should be reviewed. "There seems to be an 'ultra-severe' group as well as a 'severe' group," the researchers said.
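For readers who want the thresholds at a glance, here is a minimal Python sketch of the DSM-5 severity bands described above, extended with the "ultra-severe" split the USP group proposes; the function name and the nine-criteria cutoff are illustrative, taken only from the quote above.

    def dsm5_severity(criteria_met: int) -> str:
        """Classify substance use disorder severity from the number of
        DSM-5 criteria met over a 12-month period (illustrative sketch).

        The mild/moderate/severe bands follow the DSM-5 guidelines quoted
        above; the 'ultra-severe' band reflects the USP group's suggestion
        that patients meeting 9-11 criteria relapse at a higher rate.
        """
        if criteria_met < 2:
            return "no diagnosis"
        if criteria_met <= 3:
            return "mild"
        if criteria_met <= 5:
            return "moderate"
        if criteria_met <= 8:
            return "severe"
        return "ultra-severe"  # 9-11 criteria: higher observed relapse rate

    print(dsm5_severity(7))   # severe
    print(dsm5_severity(10))  # ultra-severe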

Refining the analysis

Another hypothesis investigated was that the pattern of cocaine use (DSM-5 score plus factors such as age at use onset and intensity of use in the month prior to hospital admission) and the cognitive deficit caused by the drug were related variables that could help predict a relapse after treatment.

The researchers used several tests to assess the participants' performance in executive functions, including working memory (required for specific actions, such as those of a waiter who has to associate orders with customers and deliver them correctly), sustained attention (the ability to focus for as long as it takes to complete a specific task, such as filling out a form, without being distracted), and inhibitory control (the capacity to control impulses).

The researchers applied the tests a week after admission on average, a waiting time they considered sufficient for urine toxicology to come back negative. The aim of this procedure was to avoid measuring the acute effects of the drug on the organism.

One of the tests, used to measure memory span, required the patient to repeat an ascending sequence of numbers presented by the researchers. Patients who had made intensive use of cocaine in the month before admission underperformed on this test.

In other tests, used to measure selective attention, cognitive flexibility and inhibitory control, patients were asked to repeat a sequence of colors and then to name the ink color in which the name of a different color was printed (the word "yellow" printed in blue ink, for example).

"The automatic response is to read what's written instead of naming the ink color, as required. Inhibitory control is essential to perform this task. This function is extremely important in the early stages of drug dependence rehabilitation, when the patient has to deal with craving and situations that stimulate the desire to use the drug," Cunha said.

The analysis showed a correlation between inhibitory control test results and the age at cocaine use onset. "The earlier the onset, the more mistakes they made, which could point to a higher risk of relapse. The idea is to identify the people who have substantial difficulties so that we can work out individualized treatment strategies," Lima said.

Intense cocaine use in the 30 days preceding admission also correlated with underperformance on the inhibitory control test and on the working memory test, in which the patient was asked to listen to a sequence of numbers and repeat them in reverse order, a task that reflects an important function for manipulating information as a basis for decision making.
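The forward and backward digit-span tasks mentioned above can be sketched in a few lines; this is a toy illustration of the scoring idea, not the clinical test:

    def digit_span_correct(presented, response, backward=False):
        """Check one digit-span trial: forward repeats the sequence as
        heard; backward (the working-memory variant) repeats it reversed."""
        target = list(reversed(presented)) if backward else list(presented)
        return list(response) == target

    print(digit_span_correct([2, 8, 5], [2, 8, 5]))                 # True (forward)
    print(digit_span_correct([2, 8, 5], [5, 8, 2], backward=True))  # True (backward)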

According to the researchers, studies have shown that executive functions may be restored after a period of abstinence, but to what extent and how long this recovery may take are unknown.

The research group at FM-USP advocates cognitive rehabilitation programs to support the recovery process. "We have a motivational chess proposal that's being studied right now," Cunha said. "A therapist plays chess with the patient and then discusses the moves, making analogies with the patient's life in an attempt to transfer the knowledge acquired in the chess game to day-to-day experience and train inhibitory control, planning and healthy decision making in the real world."

In his view, the assessment of cognitive deficits is important both for diagnosing substance use disorder and for predicting relapse. "Chemical dependence is a brain disease, and these neuropsychological tests offer an objective yardstick for measuring whether the damage is mild, moderate or severe, along similar lines to the classification of dementia," he said.

According to the researchers, despite the clinical relevance of neuropsychological test results, they are not part of the DSM-5 criteria, which are based solely on self-reported factors and clinical observations.

"We hope the sixth edition of the DSM takes these developments into account in recognition of the research done by us and other groups worldwide," Cunha said

florida80 05-18-2019 18:20

News Release 16-May-2019

How we make complex decisions

Neuroscientists identify a brain circuit that helps break decisions down into smaller pieces

Massachusetts Institute of Technology




CAMBRIDGE, MA -- When making a complex decision, we often break the problem down into a series of smaller decisions. For example, when deciding how to treat a patient, a doctor may go through a hierarchy of steps -- choosing a diagnostic test, interpreting the results, and then prescribing a medication.

Making hierarchical decisions is straightforward when the sequence of choices leads to the desired outcome. But when the result is unfavorable, it can be tough to decipher what went wrong. For example, if a patient doesn't improve after treatment, there are many possible reasons why: Maybe the diagnostic test is accurate only 75 percent of the time, or perhaps the medication works for only 50 percent of patients. To decide what to do next, the doctor must take these probabilities into account.

In a new study, MIT neuroscientists explored how the brain reasons about probable causes of failure after a hierarchy of decisions. They discovered that the brain performs two computations using a distributed network of areas in the frontal cortex. First, the brain computes confidence over the outcome of each decision to figure out the most likely cause of a failure, and second, when it is not easy to discern the cause, the brain makes additional attempts to gain more confidence.

"Creating a hierarchy in one's mind and navigating that hierarchy while reasoning about outcomes is one of the exciting frontiers of cognitive neuroscience," says Mehrdad Jazayeri, the Robert A. Swanson Career Development Professor of Life Sciences, a member of MIT's McGovern Institute for Brain Research, and the senior author of the study.

MIT graduate student Morteza Sarafyzad is the lead author of the paper, which appears in Science on May 16.

Hierarchical reasoning

Previous studies of decision-making in animal models have focused on relatively simple tasks. One line of research has focused on how the brain makes rapid decisions by evaluating momentary evidence. For example, a large body of work has characterized the neural substrates and mechanisms that allow animals to categorize unreliable stimuli on a trial-by-trial basis. Other research has focused on how the brain chooses among multiple options by relying on previous outcomes across multiple trials.

"These have been very fruitful lines of work," Jazayeri says. "However, they really are the tip of the iceberg of what humans do when they make decisions. As soon as you put yourself in any real decision-making situation, be it choosing a partner, choosing a car, deciding whether to take this drug or not, these become really complicated decisions. Oftentimes there are many factors that influence the decision, and those factors can operate at different timescales."

The MIT team devised a behavioral task that allowed them to study how the brain processes information at multiple timescales to make decisions. The basic design was that animals would make one of two eye movements depending on whether the time interval between two flashes of light was shorter or longer than 850 milliseconds.

A twist required the animals to solve the task through hierarchical reasoning: The rule that determined which of the two eye movements had to be made switched covertly after 10 to 28 trials. Therefore, to receive a reward, the animals had to choose the correct rule and then make the correct eye movement depending on the rule and interval. However, because the animals were not instructed about the rule switches, they could not straightforwardly determine whether an error was caused because they chose the wrong rule or because they misjudged the interval.
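A rough simulation of the trial structure as described (an 850 ms boundary and covert rule switches every 10 to 28 trials) may help make the design concrete; all names, the rule-to-saccade mapping, and the perfect-judgment assumption are illustrative, not the study's code:

    import random

    BOUNDARY_MS = 850

    def run_session(n_trials=200):
        """Simulate the task: on each trial the animal judges whether the
        flash interval is shorter or longer than 850 ms, then maps that
        judgment to an eye movement via the hidden current rule."""
        rule = 0                                  # hidden rule: 0 or 1
        next_switch = random.randint(10, 28)      # covert switch schedule
        for trial in range(n_trials):
            interval = random.uniform(400, 1300)
            judged_long = interval > BOUNDARY_MS  # assumes perfect judgment
            # Illustrative mapping -- rule 0: long -> left; rule 1: long -> right.
            if rule == 0:
                correct_move = "left" if judged_long else "right"
            else:
                correct_move = "right" if judged_long else "left"
            yield trial, rule, interval, correct_move
            next_switch -= 1
            if next_switch == 0:                  # covert rule switch
                rule = 1 - rule
                next_switch = random.randint(10, 28)

    for t, rule, iv, move in list(run_session())[:5]:
        print(t, rule, f"{iv:.0f} ms", move)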

The researchers used this experimental design to probe the computational principles and neural mechanisms that support hierarchical reasoning. Theory and behavioral experiments in humans suggest that reasoning about the potential causes of errors depends in large part on the brain's ability to measure the degree of confidence in each step of the process. "One of the things that is thought to be critical for hierarchical reasoning is to have some level of confidence about how likely it is that different nodes [of a hierarchy] could have led to the negative outcome," Jazayeri says.

The researchers were able to study the effect of confidence by adjusting the difficulty of the task. In some trials, the interval between the two flashes was much shorter or longer than 850 milliseconds. These trials were relatively easy and afforded a high degree of confidence. In other trials, the animals were less confident in their judgments because the interval was closer to the boundary and difficult to discriminate.

As they had hypothesized, the researchers found that the animals' behavior was influenced by their confidence in their performance. When the interval was easy to judge, the animals were much quicker to switch to the other rule when they found out they were wrong. When the interval was harder to judge, the animals were less confident in their performance and applied the same rule a few more times before switching.

"They know that they're not confident, and they know that if they're not confident, it's not necessarily the case that the rule has changed. They know they might have made a mistake [in their interval judgment]," Jazayeri says.

Decision-making circuit

By recording neural activity in the frontal cortex just after each trial was finished, the researchers were able to identify two regions that are key to hierarchical decision-making. They found that both of these regions, known as the anterior cingulate cortex (ACC) and dorsomedial frontal cortex (DMFC), became active after the animals were informed about an incorrect response. When the researchers analyzed the neural activity in relation to the animals' behavior, it became clear that neurons in both areas signaled the animals' belief about a possible rule switch. Notably, the activity related to the animals' belief was "louder" when animals made a mistake after an easy trial, and after consecutive mistakes.

The researchers also found that while these areas showed similar patterns of activity, it was activity in the ACC in particular that predicted when the animal would switch rules, suggesting that ACC plays a central role in switching decision strategies. Indeed, the researchers found that direct manipulation of neural activity in ACC was sufficient to interfere with the animals' rational behavior.

"There exists a distributed circuit in the frontal cortex involving these two areas, and they seem to be hierarchically organized, just like the task would demand," Jazayeri says.

florida80 05-18-2019 18:21

Human capital benefits of military boost economy by billions


North Carolina State University




A recent study from North Carolina State University finds that U.S. government spending on military personnel has a positive impact on the nation's human capital - essentially improving the American workforce. Using a new computer model, the study estimates the economic impact of this human capital improvement to be $89.8 billion for 2019 alone.

"Previous efforts to estimate the economic impact of military spending have viewed that spending collectively, whether the money was spent on buying new planes or training personnel," says Bruce McDonald, author of a paper on the study and an associate professor of public budgeting and finance in NC State's School of Public and International Affairs. "I was able to obtain spending records that detailed precisely how the military was spending its money, and that allowed me to tease out the effect of military spending on human capital and what that means for the U.S. economy."

McDonald used year-by-year expense report data for all Defense Department spending from 1949 through 2014 to develop a computational economic model. The model was able to estimate the overall economic impact of military spending, as well as what portion of that impact comes from human capital investment versus spending on material goods.

"Basically, the model shows that 18.9 percent of annual economic growth in the U.S. can be attributed to human capital investments made by the military," McDonald says. "This likely reflects the scope of the military's personnel training and education efforts - from entry level training to training in technical specialties to sending personnel to medical school."

Economic growth is measured by year-to-year growth of the GDP, which comes to $475.5 billion for 2019. And, since 18.9 percent of $475.5 billion is $89.8 billion, that's how much can be attributed this year to military spending on human capital.
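The arithmetic behind that figure, as a quick check:

    gdp_growth_2019 = 475.5e9        # year-to-year GDP growth, in dollars
    human_capital_share = 0.189      # share attributed to military human capital

    impact = gdp_growth_2019 * human_capital_share
    print(f"${impact / 1e9:.1f} billion")  # -> $89.9 billion (~$89.8B as reported)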

"Understanding these complex economic relationships is important for ensuring that we are making informed decisions about both the size of the military and how we fund it," McDonald says.

florida80 05-19-2019 17:44

News Release 19-May-2019

Researchers document impact of coffee on bowels



Rat study shows coffee changes gut microbiome and improves ability of intestines to contract

Digestive Disease Week




San Diego, Calif. (May 19, 2019) -- Coffee drinkers know that coffee helps keep the bowels moving, but researchers in Texas are trying to find out exactly why this is true, and it doesn't seem to be about the caffeine, according to a study presented at Digestive Disease Week® (DDW) 2019. Researchers, feeding rats coffee and also mixing it with gut bacteria in petri dishes, found that coffee suppressed bacteria and increased muscle motility, regardless of caffeine content.

"When rats were treated with coffee for three days, the ability of the muscles in the small intestine to contract appeared to increase," said Xuan-Zheng Shi, PhD, lead author of the study and associate professor in internal medicine at the University of Texas Medical Branch, Galveston. "Interestingly, these effects are caffeine-independent, because caffeine-free coffee had similar effects as regular coffee."

Coffee has long been known to increase bowel movement, but researchers have not pinpointed the specific reason or mechanism. Researchers examined changes to bacteria when fecal matter was exposed to coffee in a petri dish, and by studying the composition of feces after rats ingested differing concentrations of coffee over three days. The study also documented changes to smooth muscles in the intestine and colon, and the response of those muscles when exposed directly to coffee.

The study found that growth of bacteria and other microbes in fecal matter in a petri dish was suppressed with a solution of 1.5 percent coffee, and growth of microbes was even lower with a 3 percent solution of coffee. Decaffeinated coffee had a similar effect on the microbiome.

After the rats were fed coffee for three days, the overall bacteria counts in their feces decreased, but researchers said more research is needed to determine whether these changes favor Firmicutes, considered "good" bacteria, or enterobacteria, which are regarded as negative.

Muscles in the lower intestines and colons of the rats showed increased ability to contract after a period of coffee ingestion, and coffee stimulated contractions of the small intestine and colon when muscle tissues were exposed to coffee directly in the lab.

The results support the need for additional clinical research to determine whether coffee drinking might be an effective treatment for post-operative constipation, or ileus, in which the intestines quit working after abdominal surgery, the authors said.


florida80 05-19-2019 17:45

Walking and strength training may decrease the risk of dying from liver disease

Study of exercise habits of 117,000 people over 26 years assesses risk factors

Digestive Disease Week


   


San Diego, Calif. (May 19, 2019) -- Physical activity, including walking and muscle-strengthening activities, was associated with significantly reduced risk of cirrhosis-related death, according to research presented at Digestive Disease Week® (DDW) 2019. Chronic liver disease is increasing, partly due to the obesity epidemic, and currently there are no guidelines for the optimal type of exercise for the prevention of cirrhosis-related mortality. Researchers hope these findings will help provide specific exercise recommendations for patients at risk for cirrhosis and its complications.

"The benefit of exercise is not a new concept, but the impact of exercise on mortality from cirrhosis and from liver cancer has not yet been explored on this scale," said Tracey Simon, MD, lead researcher on the study and instructor of medicine at Harvard Medical School and Massachusetts General Hospital, Boston. "Our findings show that both walking and strength training contribute to substantial reductions in risk of cirrhosis-related death, which is significant because we know very little about modifiable risk factors."

Dr. Simon and her team prospectively followed 68,449 women from the Nurses' Health Study and 48,748 men from the Health Professionals Follow-up Study, without known liver disease at baseline. Participants provided highly accurate data on physical activity, including type and intensity, every two years from 1986 through 2012, which allowed researchers to prospectively examine the association between physical activity and cirrhosis-related death.

Researchers observed that adults in the highest quintile of weekly walking activity had 73 percent lower risk for cirrhosis-related death than those in the lowest quintile. Further risk reduction was observed with combined walking and muscle-strengthening exercises.

Previous research has been limited to studies that assessed physical activity at just one point in time, or studies with very short-term follow-up. This was the first prospective study in a large U.S. population to include detailed and updated measurements of physical activity over such a prolonged period, which allowed researchers to more precisely estimate the relationship between physical activity and liver-related outcomes.

"In the U.S., mortality due to cirrhosis is increasing dramatically, with rates expected to triple by the year 2030. In the face of this alarming trend, information on modifiable risk factors that might prevent liver disease is needed," said Dr. Simon. "Our findings support further research to define the optimal type and intensity of physical activity to prevent adverse outcomes in patients at risk for cirrhosis."


florida80 05-19-2019 17:46

News Release 19-May-2019

Many with mild asthma and low sputum eosinophils respond equally well to steroids and placebo

'Low eosinophil' biomarkers found in nearly three-quarters of people with mild asthma

NIH/National Heart, Lung and Blood Institute

florida80 05-19-2019 17:47

News Release 19-May-2019

Growth in life expectancy in Australia slows, research finds


University of Melbourne


 


After 20 years of rapid increases in life expectancy at birth, Australia's rate of growth is now falling behind that of most other high-income nations, research shows; better control of health risk factors such as obesity will be needed if further life expectancy gains are to be achieved.

In a study published today in the Medical Journal of Australia, researchers from the University of Melbourne School of Population and Global Health analysed data for Australia and 26 other high-income countries from 1980 to 2016.

The researchers found that from 1981 to 2003, life expectancy at birth increased rapidly in Australia, both in absolute terms and in comparison with other high-income countries.

For males, the difference in life expectancy between Australia and the other 26 countries increased from +0.7 years to +2.3 years over this period. For females, the difference in life expectancy increased from +0.9 years to +1.3 years.

University of Melbourne Rowden-White Chair of Global Health and Burden of Disease Measurement Alan Lopez said the main contributor to greater increases in life expectancy for males in Australia than in western Europe was lower mortality from ischaemic heart disease.

"Compared with the United States, mortality from ischaemic heart disease, cerebrovascular disease, and transport-related injuries was lower," Laureate Professor Lopez said.

Since 2003, life expectancy has increased more slowly for both sexes than in most other high-income countries, mainly because declines in mortality from cardiovascular disease and cancer have slowed. By 2015, the difference in life expectancy between Australia and the other countries was +2.3 years for males and +1.1 years for females.

"Together with the high prevalence of obesity, this suggests that future life expectancy increases will be smaller than in other high-income countries," Professor Lopez said.

University of Melbourne researcher and co-author Tim Adair said this slowing should concern public health policy makers.

"Life expectancy in Australia is among the highest in the world, a testament to boldly progressive public health interventions over several decades," Dr Adair said.

"However, there are several major barriers to marked increases, including the notably higher mortality of more recent birth cohorts and the comparative failure of efforts to reduce levels of overweight and obesity.

"Other high-income countries have greater scope for reducing the prevalence of smoking. As a result, our high global ranking with regard to life expectancy at birth is unlikely to be maintained unless new strategies for reducing mortality associated with specific behaviours are developed and deployed effectively."

florida80 05-19-2019 17:48

News Release 19-May-2019

New risk scores help physicians provide better care for high-risk pulmonary patients, study finds


Intermountain Medical Center


   




[Image: A new laboratory-based method of estimating outcomes for patients with a severe pulmonary disorder that has no cure may help physicians better provide proper care, referrals, and services for patients... Credit: Intermountain Healthcare]

A new laboratory-based method of estimating outcomes for patients with a severe pulmonary disorder that has no cure can help physicians better provide proper care, referrals, and services for patients at the end of life, according to a new study of more than 17,000 patients from Intermountain Healthcare.

The Laboratory-based Intermountain Validated Exacerbation (LIVE) score is a model that predicts all-cause mortality, morbidity, and hospitalization rates for patients with chronic obstructive pulmonary disease (COPD), a chronic, progressive lung disease that gradually makes it hard to breathe. COPD affects roughly 16 million Americans, or just under five percent of the U.S. population. It's estimated that millions more have the disease but are undiagnosed.

The LIVE score combines a patient's simple laboratory values (levels of hemoglobin, albumin, creatinine, chloride, and potassium) to identify patients who are at high risk of death or further disease advancement, and who may most need referrals to palliative care and advanced care planning resources.
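The article does not give the LIVE score's actual formula or cutoffs, but its general shape, combining routine lab values into a risk category, might look like the following purely hypothetical sketch; every weight and threshold below is invented for illustration:

    def live_score_sketch(hemoglobin, albumin, creatinine, chloride, potassium):
        """Hypothetical illustration of a lab-based risk score.

        The real LIVE score combines these five lab values, but its actual
        weights and risk cutoffs are not given in the article; everything
        numeric below is invented for illustration only.
        """
        score = 0
        if hemoglobin < 12.0:  score += 1   # g/dL, anemia
        if albumin < 3.5:      score += 1   # g/dL, low albumin
        if creatinine > 1.3:   score += 1   # mg/dL, renal impairment
        if chloride < 98:      score += 1   # mmol/L
        if potassium > 5.0:    score += 1   # mmol/L
        return "high risk" if score >= 3 else "low risk"

    print(live_score_sketch(10.5, 3.1, 1.6, 96, 5.2))  # high risk (illustrative)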

In the study, Intermountain Healthcare researchers calculated the LIVE scores of 17,124 patients with COPD from the Kaiser Health System Northwest Region. They found that patients with high-risk LIVE scores had the highest one-year mortality rate (39.4%) and the highest rate of palliative care referrals (41.7%). In comparison, patients with the lowest-risk LIVE scores had a 0.7% all-cause one-year mortality rate and a 0.7% palliative care referral rate.

"We found the LIVE score helps personalize therapy to patients beyond the COPD diagnosis alone and provides additional risk information to both patients and their doctors. From a population health perspective, the LIVE score allows for designing pathways of care that identify and treat patients based on individual risk beyond a single diagnosis label alone," said Denitza Blagev, MD, the study's lead author, and a pulmonary and critical care physician at Intermountain Medical Center, who serves as medical director for Quality, Specialty Based Care at Intermountain Healthcare in Salt Lake City.

Results from the study will be presented at the annual international conference of the American Thoracic Society, in Dallas on May 19.

Unlike other COPD risk scores, the LIVE score is entirely based on blood tests and assesses rates of other diseases in COPD patients rather than lung function specifically. While diseases such as heart disease and kidney disease contribute to the risk of death, hospitalization, and symptoms in patients with COPD, there has been no systematic way of incorporating these diseases in determining overall risk for patients with COPD, until now.

Researchers say the findings can help physicians determine which of their COPD patients are at highest risk, and who may benefit from palliative care and appropriate end-of-life services. Palliative care focuses on relief from the symptoms of a serious disease, rather than on a cure, and is often provided in the final stages of a patient's life.

While patients with COPD in general are considered high risk, there is a lot of variability in the risk of death for a particular patient with a COPD diagnosis. By using the LIVE score, clinicians can design health system interventions that assess high-risk patients for palliative care evaluation.

"By exploring the association of palliative care referrals and LIVE score risk, this study is a step forward in understanding how the LIVE score may be used to target appropriate patient care," said Dr. Blagev.

"Our findings lend more insight into how we can use these laboratory-based scores at the bedside to ensure that patients are receiving the most appropriate care," she said. "This doesn't mean everyone with high risk needs to be referred to palliative care, but it shows potential opportunities to improve care for patients in that highest risk group," said Dr. Blagev.

For example, for a COPD patient with a low-risk LIVE score, interventions aimed at optimizing COPD management may be most effective, as the risk of other diseases and death is relatively low. In contrast, a patient with a high-risk LIVE score may see benefit from COPD-directed therapy, but may find even more improvement with management of their other diseases, which contribute to the risk of death.

Researchers note that the LIVE score model has already been validated in more than 100,000 COPD patients at several diverse health systems, so these new study findings further demonstrate the effectiveness of using the model to enhance care and planning for patients.

florida80 05-19-2019 18:02

Bacterial pneumonia predicts ongoing lung problems in infants with acute respiratory failure


American Thoracic Society




ATS 2019, Dallas, TX -- Bacterial pneumonia appears to be linked to ongoing breathing problems in previously healthy infants who were hospitalized in a pediatric intensive care unit for acute respiratory failure, according to research presented at ATS 2019. The researchers found that infants with bacterial pneumonia were more likely to have lung problems requiring supplemental oxygen, bronchodilators or steroids when they left the hospital.

florida80 05-19-2019 18:03

News Release 18-May-2019

Button batteries can rapidly damage stomach lining before symptoms appear

Experts recommend changing current practice of watchful waiting

Digestive Disease Week


   


San Diego, CA (May 18, 2019) -- Damage to the lining of the stomach can occur quickly when children swallow button batteries; therefore, clinicians should consider prompt endoscopic removal, even when the child is symptom free and the battery has passed safely through the narrow esophagus, according to research presented at Digestive Disease Week® (DDW) 2019. The recommendations represent a change from current practice of watching and waiting.

"We know there can be injury even when there are no symptoms," said Racha Khalaf, MD, lead researcher and pediatric gastroenterology, hepatology and nutrition fellow at the Digestive Health Institute at Children's Hospital Colorado, Aurora. "Batteries in the stomach cause damage, including perforation of the gastric wall, so physicians should consider removing the batteries as soon as possible and not let them pass through the digestive tract."

Researchers from pediatric hospitals in Colorado, Florida, Texas and Ohio collected data regarding 68 button battery ingestions from January 2014 to May 2018. Previous research has been conducted on button batteries lodged in the esophagus, but little is known about the effect in the stomach.

"We have been seeing more injuries from button batteries," Dr. Khalaf said. "The batteries come in toys, remote controls, key fobs, singing greeting cards and watches. They are everywhere."

Erosive injuries to the mucous lining of the stomach were found in 60 percent of cases reviewed, with no apparent relationship between damage and symptoms, or with the amount of time passed since ingestion. This suggests that clinicians and parents should not wait for symptoms or passage of time to act, Dr. Khalaf said, adding that removing the battery earlier avoids repeated trips to the emergency room or pediatrician's office and reduces repetitive x-rays or other imaging.

The authors' recommendations are more aggressive than those of two national organizations that have issued recommendations about button battery ingestion. The North American Society for Pediatric Gastroenterology, Hepatology and Nutrition recommends observation when it has been less than two hours since ingestion, the battery is 20 mm or smaller, and the child is at least 5 years old. The National Capital Poison Center, which runs the National Battery Ingestion Hotline, currently recommends observation alone for asymptomatic gastric button batteries to allow them to pass through the digestive system.
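The NASPGHAN criteria quoted above amount to a simple conjunctive rule: observation is only an option when all three conditions hold. A minimal sketch (function and parameter names are illustrative):

    def naspghan_allows_observation(hours_since_ingestion: float,
                                    battery_mm: float,
                                    age_years: float) -> bool:
        """NASPGHAN criteria as quoted above: observation is an option only
        when ALL three conditions hold (illustrative encoding)."""
        return (hours_since_ingestion < 2
                and battery_mm <= 20
                and age_years >= 5)

    # A 3-year-old who swallowed a 20 mm battery an hour ago fails the
    # age criterion, so observation alone would not be recommended.
    print(naspghan_allows_observation(1.0, 20, 3))  # False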

florida80 05-19-2019 18:04

News Release 18-May-2019

Men ignore serious health risks of steroid abuse in pursuit of the body beautiful


European Society of Endocrinology


  


Many men continue to abuse steroids despite knowing that they have serious, life-limiting and potentially lethal side effects, according to findings to be presented in Lyon, at the European Society of Endocrinology annual meeting, ECE 2019. The study findings indicate that men using anabolic steroids to improve strength and physical performance are often aware of the side effects but choose to continue taking them. This raises serious concerns not only for their own health but that of future generations, since side effects are known to damage sperm as well as increase the risk of sexual dysfunction, heart disease and liver damage.

Anabolic steroids such as testosterone are performance-enhancing hormones that increase muscle mass and boost athletic ability, which has led to their misuse and abuse by some, and men in particular. However, the use of steroids has some life-limiting and serious side effects, including reduced sperm count, erectile dysfunction, baldness, breast development and an increased risk of heart disease, stroke and liver or kidney failure. Despite this, steroid misuse persists: a 2014 study estimated that worldwide 3.3% of the population, or 6.4% of the male population, are abusing steroids. Recent evidence has suggested that not only do steroids pose serious health risks to the individual but that they also cause damage to sperm, so could be harmful to users' future children. To adequately tackle this health issue, it is necessary to establish whether men abusing steroids are fully aware of all the risks or are choosing to ignore them.

In this study, Dr Mykola Lykhonosov and colleagues from Pavlov First Saint Petersburg State Medical University in Russia conducted an anonymous survey of men who regularly attend the gym to assess their knowledge of, use of and attitude towards the health risks of anabolic steroids. Of 550 respondents, 30.4% said they used steroids; 74.3% of users were aged 22-35 years, and 70.2% of users said they were aware of the side effects. In addition, 54.8% of all respondents indicated that they would like to receive more expert information on steroids and their side effects.

Dr Lykhonosov says, "These findings were surprising: not only was the prevalence of steroid abuse high, but knowledge of the damaging side effects was also high, yet this does not stop users from taking them."

Dr Lykhonosov now plans to investigate how to treat hormonal imbalances and disorders caused by steroid abuse. He also thinks that greater public awareness of steroid abuse and its health risks may help discourage users.

Dr Lykhonosov comments, "We need to tackle this growing public health problem. Increasing awareness through the promotion of stories from former users, on how steroid abuse has negatively impacted their health and lives, could send a strong message to discourage abuse."

florida80 05-19-2019 18:05

News Release 18-May-2019

Breastfeeding reduces long-term risk of heart disease in mothers


European Society of Endocrinology




Women who breastfed their babies are less likely to develop heart disease later in life, according to findings to be presented in Lyon, at the European Society of Endocrinology annual meeting, ECE 2019. The study also suggests that the protective effect on heart health is increased in women who breastfed for longer periods of time. These findings provide further evidence for the long-term health benefits of breastfeeding and that women should be encouraged to do so when possible.

Breastfeeding has previously been shown to reduce the risk of postpartum depression and the risk of certain cancers in women. It has also been established that breastfeeding can help mothers maintain a healthy body weight and regulate their blood sugar. These benefits are likely related to the higher levels of the hormone prolactin in breastfeeding mothers. More recently, studies have indicated that prolactin reduces the risk of diabetes, which is a major risk factor for cardiovascular disease. Cardiovascular disease is a leading cause of death among women worldwide, but the long-term protective effects of breastfeeding on heart disease risk have not been adequately investigated.

In this study, Professor Irene Lambrinoudaki from the University of Athens and colleagues measured markers of heart and blood vessel health in postmenopausal women in relation to their history of breastfeeding. After adjusting for other cardiovascular risk factors, including body weight, age, cholesterol levels and smoking habits, the data indicated that women who had breastfed had significantly lower levels of heart disease and heart disease risk indicators. This effect was even more pronounced in women who had breastfed for longer periods of time.

Prof Lambrinoudaki says, "These findings indicate that breastfeeding lowers the risk of heart disease in women. However, this is an association study only; we are now interested in establishing the underlying causes of this protective effect."

Prof Lambrinoudaki comments, "If we can show causality for the protective effect, women will have one more reason to nurse their infants, beyond the already documented benefits of breastfeeding for the short- and long-term health of both them and their children."

Prof Lambrinoudaki's team are now investigating the molecular mechanisms of how prolactin affects blood sugar, which is a major risk factor for heart disease. This research could uncover new mechanisms to target in the prevention of heart disease for everyone, not just breastfeeding women.

florida80 05-19-2019 18:09

Clinical trial at IU School of Medicine improves treatment of genetic rickets


Indiana University


 






[Image: Erik Imel, MD. Credit: Indiana University School of Medicine]

A new study shows a drug developed in conjunction with investigators at Indiana University School of Medicine to alleviate symptoms of a rare musculoskeletal condition is significantly more effective than conventional therapies. The findings are published in The Lancet.

X-linked hypophosphatemia, or XLH, is a phosphate-wasting disease that causes rickets and osteomalacia, or softening of the bones, and can cause short stature, bowed legs, dental abscesses and bone pain. This rare, genetic disease affects about 1 in every 20,000 people.

Researchers recruited 61 children between the ages of 1 and 12 at 16 centers around the world, including the U.S., Canada, the United Kingdom, Sweden, Australia, Japan and Korea. The children were randomly assigned either to receive Burosumab, a biweekly injection that was approved by the Food and Drug Administration in April 2018, or to conventional therapy of oral phosphate and active vitamin D taken several times a day. The primary outcome was improvement in rickets on X-rays, as scored by radiologists who were unaware of which treatment group each participant was in.

The children were observed for 64 weeks, and by 40 weeks of treatment, researchers found 72 percent of the children who received Burosumab achieved substantial healing of rickets, while only 6 percent of those in the conventional therapy group saw substantial healing. Burosumab also led to greater improvements in leg deformities, growth, distance walked in a 6-minute test and serum phosphorus and active vitamin D levels.

"This is the first study comparing Burosumab head-to-head with conventional therapy," said lead investigator Erik Imel, MD, associate professor of medicine at IU School of Medicine. "We now know the magnitude of benefit from Burosumab over the prior approach with conventional therapy. This information is critical for doctors to make treatment decisions for patients with XLH."

Researchers plan to continue studying the long-term effects of Burosumab, including its effect on children's adult height outcomes and whether treatment will decrease the need for surgeries to correct bowed legs.

Burosumab blocks a protein called fibroblast growth factor 23 that was originally discovered by investigators at Indiana University School of Medicine. Burosumab is marketed by Ultragenyx Pharmaceutical, Inc. in collaboration with Kyowa Hakko Kirin Co., Ltd. and its European subsidiary, Kyowa Kirin International PLC, under the brand name Crysvita.


florida80 05-19-2019 18:10

News Release 17-May-2019

IU researchers develop electric field-based dressing to help heal wound infections


Indiana University





[Image: Chandan Sen, PhD. Credit: Indiana University School of Medicine]

Researchers at Indiana University School of Medicine have found a way to charge up the fight against bacterial infections using electricity.

Work conducted in the laboratories of Chandan Sen, PhD, and Sashwati Roy, PhD, at the Indiana Center for Regenerative Medicine and Engineering has led to the development of a dressing that uses an electric field to disrupt biofilm infection. Their findings were recently published in the journal Annals of Surgery.

Bacterial biofilms are thin, slimy films of bacteria that form on some wounds, including burns or post-surgical infections, as well as after a medical device, such as a catheter, is placed in the body. These bacteria generate their own electricity, using their own electric fields to communicate and form the biofilm, which makes them more hostile and difficult to treat. The Centers for Disease Control and Prevention estimates 65 percent of all infections are caused by bacteria with this biofilm phenotype, while the National Institutes of Health estimates that number is closer to 80 percent.

Researchers at IU School of Medicine are the first to study the practice of using an electric field-based dressing to treat biofilms rather than antibiotics. They discovered the dressing is not only successful in fighting the bacteria on its own, but when combined with other medications can make them even more effective. This discovery has the potential to create significant changes in the way physicians treat patients with bacterial infections which are resistant to antibiotics. The dressing can also help prevent new biofilm infections from forming in the future. The dressing electrochemically self-generates 1 volt of electricity upon contact with body fluids such as wound fluid or blood, which is not enough to hurt or electrocute the patient.

"This shows for the first time that bacterial biofilm can be disrupted by using an electroceutical dressing," said Chandan Sen, PhD, director of the Indiana Center for Regenerative Medicine and Engineering and associate vice president of research for the IU School of Medicine Department of Surgery. "This has implications across surgery as biofilm presence can lead to many complications in successful surgical outcomes. Such textile may be considered for serving as hospital fabric - a major source of hospital acquired infections"

Marketing of the dressing for burn care was recently approved by the Food and Drug Administration. The team is now studying the device's effectiveness in patients recovering from burns.

florida80 05-19-2019 18:11

Children in Quebec are not diagnosed early enough with type 1 diabetes

A study reveals a rise in the number of children presenting a life-threatening complication at diagnosis

McGill University Health Centre




[Image: Dr. Meranda Nakhla, senior author of the study, is a pediatric endocrinologist at the Montreal Children's Hospital of the McGill University Health Centre (MUHC) and a scientist from the... Credit: McGill University Health Centre (Julie Robert)]


Montreal, May 14, 2019 -- Elwyn was a healthy 13-month-old toddler when she started drinking water from the bathtub. Over time, she became increasingly thirsty and demanded more and more breast milk. For her parents, this seemed like typical behaviour related to a growth spurt. One day, however, they noticed that she was abnormally weak and rushed her to the emergency department. She was diagnosed with type 1 diabetes and had already developed a life-threatening complication of the disease known as diabetic ketoacidosis. She was immediately transferred to the intensive care unit, where she was treated for several days. Now two years old, Elwyn is still recovering, but doing better.

Unfortunately, the late diagnosis of type 1 diabetes and its severe complications is not uncommon. According to a new study led by a team at the Research Institute of the McGill University Health Centre (RI-MUHC), more than 25% of children in Quebec diagnosed with type 1 diabetes already have diabetic ketoacidosis at diagnosis. Their findings, published today in CMAJ Open, indicate this number has been rising by two percent per year since 2001.

"The symptoms of type 1 diabetes are not recognized fast enough by the parents, the schools or healthcare providers," says the study's lead author, Dr. Meranda Nakhla, a pediatric endocrinologist at the Montreal Children's Hospital of the MUHC and a scientist from the Child Health and Human Development Program of the RI-MUHC. "A simple blood sugar test is all that is needed to diagnose a child with type 1 diabetes in presence of symptoms such as frequent urination, excessive thirst, weight loss, a lack of energy and constant hunger."

Type 1 diabetes is one of the most common chronic diseases of childhood and affects around 4,000 children in Quebec. It occurs when the pancreas stops producing insulin, an important hormone that helps the body control the level of sugar in the blood. Diabetic ketoacidosis (DKA) is a serious complication of diabetes that occurs when the body produces high levels of blood acids that become toxic.

"Diabetic ketoacidosis is generally an avoidable and preventable complication of type 1 diabetes. If caught early, the child is started on insulin, preventing the development of diabetic ketoacidosis," adds Dr. Nakhla, who is also an assistant professor of pediatrics at McGill University.

Researchers looked at the trends of DKA by analyzing data provided by the Institut national de santé publique du Québec (INSPQ), which focused on the diagnosis of type 1 diabetes in patients between the ages of 1 to 17 years, from 2001 to 2014. They identified a total of 5,741 new cases of diabetes among children and adolescents. Overall, 1,471 children presented with DKA at diabetes diagnosis (with a peak between 5 and 11 years old). Researchers also looked at different factors such as age at diabetes diagnosis, biological sex, socioeconomic and rural status.

"We have not yet been able to establish the exact causes of the increased occurrence of DKA in Quebec," explains first study author, Dr. Marie-Ève Robinson, a pediatric endocrinologist who was a research fellow at the Montreal Children's Hospital at the time of the study. "It would appear that access to the front-line health care system could be a factor, especially for people living outside of major cities."

"Our results show that action needs to be taken and underscore the need for awareness campaigns in Quebec, which are now non-existent, about the symptoms of type 1 diabetes among the general public and general practitioners across the province," states Dr. Nakhla.

florida80 05-19-2019 18:12

News Release 17-May-2019

Enzyme may indicate predisposition to cardiovascular disease

Study suggests that people with low levels of PDIA1 in blood plasma may be at high risk of thrombosis; this group also investigated PDIA1's specific interactions in cancer

Fundação de Amparo à Pesquisa do Estado de São Paulo




Measuring the blood plasma levels of an enzyme called PDIA1 could one day become a method of diagnosing a person's predisposition to cardiovascular disease even if they are healthy, i.e., not obese, diabetic or a smoker, and with normal cholesterol.

This is suggested by a study published in the journal Redox Biology by Brazilian researchers affiliated with the University of São Paulo (USP), the University of Campinas (UNICAMP) and Butantan Institute.

The investigation was conducted under the aegis of the Center for Research on Redox Processes in Biomedicine (Redoxome), one of the Research, Innovation and Dissemination Centers (RIDCs) funded by the São Paulo Research Foundation (FAPESP). Redoxome is hosted by USP's Chemistry Institute.

"This molecule belongs to the protein disulfide isomerase [PDI] family. Our study showed that people with low plasma levels of PDIA1 have a more inflammatory protein profile and hence run a higher risk of thrombosis. On the other hand, people with high levels of PDIA1 have more 'housekeeping' proteins associated with cell adhesion, homeostasis and the organism's normal functioning," said Francisco Rafael Martins Laurindo, a professor at the University of São Paulo's Medical School (FM-USP) and principal investigator for the study.

The study was conducted during the PhD research of Percíllia Victória Santos de Oliveira with a scholarship from FAPESP.

The group analyzed blood plasma samples from 35 healthy volunteers with no history of chronic or acute disease. None was a smoker or a user of recreational drugs or chronic medication.

Plasma was collected 10-15 times at intervals of days or weeks during a period of 10-15 months. Circulating PDI levels were within a small range for most individuals. Moreover, in a cohort of five individuals, PDIA1 levels were measured three times in a nine-hour period. The variability of the results was again negligible.

"However, the measurements showed that some patients had high levels of PDIA1, while the levels were very low, almost undetectable, in others. When the tests were repeated for the same person over time, these values hardly varied at all," said Laurindo, who heads the Translational Cardiovascular Biology Laboratory at the Heart Institute (InCor) attached to FM-USP's teaching and general hospital (Hospital das Clínicas).

The researchers also measured the levels of PDIA1 in 90 plasma bank samples from patients with chronic cardiovascular disease. The analysis consistently showed low levels of the enzyme.

They then conducted several additional proteomic studies to investigate how the plasma levels of PDIA1 correlated with an individual's protein signature. The adhesion and migration of cultured vein endothelial cells treated with PDIA1-poor plasma were impaired in comparison with those of cells treated with PDIA1-rich plasma.

These results led to the hypothesis that the plasma level of PDIA1 could be a window onto individual plasma protein signatures associated with endothelial function, which could indicate a possible predisposition to cardiovascular disease.

The study also showed no correlation between PDIA1 levels and well-known risk factors for cardiovascular disease, such as triglycerides and cholesterol.

The next steps for the research group include studying PDIA1 levels in conditions such as acute coronary disease, as well as other members of the protein disulfide isomerase family (there are more than 20 PDIs all told), to compare results and confirm whether all these enzymes are potential markers of vulnerability to cardiovascular disease.

Inhibitors

Clinical trials of inhibitors of other PDIs are being conducted by different groups of researchers in several parts of the world. Because these enzymes play various essential roles in cell survival, Laurindo explained, it is important to understand their specific interactions in the cancer context to design inhibitors capable of eliminating tumors with a minimum of toxicity to normal cells.

In another study, published in the American Journal of Physiology-Heart and Circulatory Physiology, the researchers used an antibody to inhibit PDIA1 on the surface of vascular cells and observed the effects of stimulation with several different mechanical forces, such as stretching and alterations to the rigidity of the extracellular matrix.

Resulting from research conducted during Leonardo Yuji Tanaka's postdoctoral internship (https://bv.fapesp.br/en/pesquisador/77430/leonardo-yuji-tanaka) with support from FAPESP (https://bv.fapesp.br/en/bolsas/151090), the study concluded that surface PDIA1 inhibition affected the cytoskeleton, an intracellular framework of filaments, thereby hindering cell migration.

"PDIA1 is fundamental for the ability of cells to migrate within the organism, and so it mustn't be completely inhibited. When the surface portion, which corresponds to less than 2% of total PDIA1, is silenced, the cell survives but loses fine regulation of cell direction during migration. This can be leveraged in the search for new disease mechanisms and drugs," Laurindo explained.

florida80 05-19-2019 18:13

News Release 17-May-2019

Wearable cooling and heating patch could serve as personal thermostat and save energy


University of California - San Diego




IMAGE: Prototype of the cooling and heating patch embedded in a mesh armband. Credit: David Baillot/UC San Diego Jacobs School of Engineering


Engineers at the University of California San Diego have developed a wearable patch that could provide personalized cooling and heating at home, work, or on the go. The soft, stretchy patch cools or warms a user's skin to a comfortable temperature and keeps it there as the ambient temperature changes. It is powered by a flexible, stretchable battery pack and can be embedded in clothing. Researchers say wearing it could help save energy on air conditioning and heating.

The work is published May 17 in the journal Science Advances.

"This type of device can improve your personal thermal comfort whether you are commuting on a hot day or feeling too cold in your office," said Renkun Chen, a professor of mechanical and aerospace engineering at UC San Diego who led the study.

The device, which is at the proof-of-concept stage, could also save energy. "If wearing this device can make you feel comfortable within a wider temperature range, you won't need to turn down the thermostat as much in the summer or crank up the heat as much in the winter," Chen said. Keeping a building's set temperature 12 degrees higher during the summer, for example, could cut cooling costs by about 70 percent, he noted.

There are a variety of personal cooling and heating devices on the market, but they are not the most convenient to wear or carry around. Some use a fan, and some need to be soaked or filled with fluid such as water.

Chen and a team of researchers at the UC San Diego Jacobs School of Engineering designed their device to be comfortable and convenient to wear. It's flexible, lightweight and can be easily integrated into clothing.

The patch is made of thermoelectric alloys--materials that use electricity to create a temperature difference and vice versa--sandwiched between stretchy elastomer sheets. The device physically cools or heats the skin to a temperature that the wearer chooses.

"You could place this on spots that tend to warm up or cool down faster than the rest of the body, such as the back, neck, feet or arms, in order to stay comfortable when it gets too hot or cold," said first author Sahngki Hong, a UC San Diego mechanical engineering alumnus who worked on the project as a PhD student in Chen's lab.

The researchers embedded a prototype of the patch into a mesh armband and tested it on a male subject. Tests were performed in a temperature-controlled environment. In two minutes, the patch cooled the tester's skin to a set temperature of 89.6 degrees Fahrenheit. It kept the tester's skin at that temperature as the ambient temperature was varied between 71.6 and 96.8 degrees Fahrenheit.

A building block for smart clothing

The ultimate goal is to combine multiple patches together to create smart clothing that can be worn for personalized cooling and heating. So engineers designed a soft electronic patch that can stretch, bend and twist without compromising its electronic function.

The work is a collaboration between several research groups at the UC San Diego Jacobs School of Engineering. Chen's lab, which specializes in heat transfer technology, led the study. They teamed up with nanoengineering professors Sheng Xu, an expert in stretchable electronics, Shirley Meng, an expert in battery technology, Ping Liu, who is also a battery expert, and Joseph Wang, a wearable sensors expert.

The researchers built the patch by taking small pillars of thermoelectric materials (made of bismuth telluride alloys), soldering them to thin copper electrode strips, and sandwiching them between two elastomer sheets.

The sheets are specially engineered to conduct heat while being soft and stretchy. Researchers created the sheets by mixing a rubber material called Ecoflex with aluminum nitride powder, a material with high thermal conductivity.

The patch uses an electric current to move heat from one elastomer sheet to the other. As the current flows across the bismuth telluride pillars, it drives heat along with it, causing one side of the patch to heat up and the other to cool down.

"To do cooling, we have the current pump heat from the skin side to the layer facing outside," Chen explained. "To do heating, we just reverse the current so heat pumps in the other direction."

The patch is powered by a flexible battery pack. It is made of an array of coin cells all connected by spring-shaped copper wires and embedded in a stretchable material.

Saving energy

One patch measures 5 × 5 centimeters and draws up to 0.2 watts of power. Chen's team estimates that it would take 144 patches to create a cooling vest, which would use about 26 watts total to keep an individual cool on an average hot day (during extreme heat, estimated power use would climb to about 80 watts, roughly what a laptop uses). By comparison, a conventional air conditioning system uses tens of kilowatts to cool down an entire office.
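
The vest arithmetic is easy to reproduce from the figures quoted above. A minimal sketch; the per-patch draw and patch count are from the release, the rest is multiplication:

    # Power budget for a cooling vest built from the patches.
    watts_per_patch = 0.2      # peak draw per 5 x 5 cm patch, per the release
    patches_per_vest = 144

    vest_w = patches_per_vest * watts_per_patch
    print(f"Vest draw at 0.2 W per patch: {vest_w:.1f} W")   # ~28.8 W
    # The release quotes about 26 W on an average hot day and up to 80 W in
    # extreme heat; the 80 W figure implies driving the patches harder than
    # the nominal 0.2 W each.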

It's more energy-efficient to cool down an individual person than a large room, researchers noted. "If there are just a handful of occupants in that room, you are essentially consuming thousands of watts per person for cooling. A device like the patch could drastically cut down on cooling bills," Chen said.

The team is now working on patches that could be built into a prototype cooling and heating vest. They hope to commercialize the technology in a few years.

"We've solved the fundamental problems, now we're tackling the big engineering issues--the electronics, hardware, and developing a mobile app to control the temperature," Chen said

florida80 05-19-2019 18:15

'Imagine...' -- our attitudes can change solely by the power of imagination

Roland Benoit and Philipp Paulus together show that our attitudes can be influenced not only by what we actually experience but also by what we imagine

Max Planck Institute for Human Cognitive and Brain Sciences




Sometimes in life there are special places that seem to stand out to us - a school playground, perhaps an old church, or that inconspicuous street corner where you were kissed for the first time. Before the kiss you had never even noticed that corner. It's as if the special experience with that beloved person transferred positive emotion to the location. Our attitude towards these places thus suddenly changes - they become valuable to us. But could this also happen purely by the power of imagination rather than by actual experiences? Roland Benoit and Philipp Paulus from the Max Planck Institute for Human Cognitive and Brain Sciences, together with Daniel Schacter from Harvard University, have examined this question in a study published in the journal Nature Communications. They show that our attitudes can be influenced not only by what we actually experience but also by what we imagine. Furthermore, they believe the phenomenon is based on activity in a particular location in the front of our brains, the ventromedial prefrontal cortex.

Participants in their study were first asked to name people that they like very much and also people they don't like at all. In addition, they were asked to provide a list of places that they considered to be neutral. Later, when the participants were lying in the MRI scanner, they were asked to vividly imagine how they would spend time with a much-liked person at one of the neutral places. "So I might imagine myself with my daughter in the elevator of our institute, where she wildly pushes all the buttons. Eventually, we arrive at the roof top terrace, where we get out to enjoy the view," describes first author Roland Benoit, who heads the research group 'Adaptive Memory'.

After the MRI scanning, he and his colleagues were able to determine that the attitudes of the participants towards the places had changed: the previously neutral places that had been imagined with liked people were now regarded more positively than at the beginning of the study. The authors first observed this effect with study participants in Cambridge, Massachusetts, and then successfully replicated it in Leipzig, Germany. "Merely imagining interacting with a much-liked person at a neutral place can transfer the emotional value of the person to this place. And we don't even have to actually experience the episode in reality," is how co-author Daniel Schacter sums it up.

Using MRI data, the researchers were able to show how this mechanism works in the brain. The ventromedial prefrontal cortex plays an important role in this process. The authors assumed that this is where information about individual people and places in our environment is stored. This region also evaluates how important individual people and places are to us. "We propose that this region bundles together representations of our environment by binding information from the entire brain into an overall picture," Roland Benoit explains.

"For example, there would be a representation with information about my daughter - what she looks like, how her voice sounds, how she reacts in certain situations. The idea now is that these representations also include an evaluation - for example, how important my daughter is to me and how much I love her."

Indeed, when the participants thought of a person that they liked more strongly, the scientists saw signs of greater activity in that region. "Now, when I imagine my daughter in the elevator, both her representation and that of the elevator become active in the ventromedial prefrontal cortex. This, in turn, can connect these representations - the positive value of the person can thus transfer to the previously neutral location."

Why are the researchers interested in this phenomenon? They want to better understand the human ability to experience hypothetical events through imagination, and how we learn from imagined events much as we do from actual experiences. This mechanism can potentially improve future-oriented decisions and help us avoid risks. According to Benoit, it will also be important to understand the consequences of negative thoughts: "In our study, we show how positive imaginings can lead to a more positive evaluation of our environment. I wonder how this mechanism influences people who tend to dwell on negative thoughts about their future, such as people who suffer from depression. Does such rumination lead to a devaluation of aspects of their life that are actually neutral or even positive?" This could be the next interesting research question for his team.

###

florida80 05-19-2019 18:16

News Release 17-May-2019

Study finds narrowing gender gap in youth suicides

Recent data show a disproportionate increase in the suicide rate among female relative to male youth

Nationwide Children's Hospital




New research from Nationwide Children's Hospital finds a disproportionate increase in youth suicide rates for females relative to males, particularly in younger youth aged 10-14 years. The report, which describes youth suicide trends in the United States from 1975 to 2016, appears this week in JAMA Network Open.

Suicide is the second leading cause of death among youth aged 10-19 years in the U.S., with rates historically higher in males than females. However, recent reports from the Centers for Disease Control and Prevention reveal female youth are experiencing a greater percent increase in suicide rates compared to males.

Donna Ruch, PhD, a post-doctoral researcher in the Center for Suicide Prevention and Research at the Research Institute at Nationwide Children's Hospital, examined these trends by investigating suicide rates among U.S. youth aged 10-19 years from 1975 through 2016.

The researchers found that, following a downward trend for both sexes in the early 1990s, suicide rates have increased for both sexes since 2007, with rates for females rising more steeply. The increase was significant and disproportionate for females relative to males, with the largest percentage increase among younger females. These trends were observed across all regions of the country.
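
One way to see what a "narrowing gap" means in practice is the male-to-female rate ratio, which shrinks when female rates rise faster than male rates. A minimal sketch; the rates below are invented for illustration and are not the study's data:

    # Illustration of a "narrowing gap": the male-to-female suicide rate ratio
    # shrinks when female rates rise faster. Rates are hypothetical
    # (per 100,000), NOT the study's data.
    years = [2007, 2016]
    male_rate = {2007: 10.0, 2016: 12.0}     # +20% over the period
    female_rate = {2007: 2.0, 2016: 3.6}     # +80% over the period

    for y in years:
        ratio = male_rate[y] / female_rate[y]
        print(f"{y}: male/female rate ratio = {ratio:.1f}")
    # 2007: 5.0 -> 2016: 3.3  (the gap narrows even though male rates also rose)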

"Overall, we found a disproportionate increase in female youth suicide rates compared to males, resulting in a narrowing of the gap between male and female suicide rates," said Dr. Ruch.

When the researchers looked at the data by method, they found the rates of female suicides by hanging or suffocation are approaching those of males. This is especially troubling in light of the gender paradox in suicidal behavior, according to Jeff Bridge, PhD, director of the Center for Suicide Prevention and Research at Nationwide Children's and co-author of the new study.

Dr. Bridge said females have higher rates of non-fatal suicidal behavior, such as thinking about and attempting suicide, but more males die by suicide than females.

"One of the potential contributors to this gender paradox is that males tend to use more violent means, such as guns or hanging," said Dr. Bridge. "That makes the narrowing of the gender gap in suicide by hanging or suffocation that we found especially concerning from a public health perspective."

The researchers called for future work to examine whether there are gender-specific risk factors that have changed in recent years and how these determinants can inform intervention.

"From a public health perspective, in terms of suicide prevention strategies, our findings reiterate the importance of not only addressing developmental needs but also taking gender into account," said Dr. Ruch.

Dr. Bridge emphasized that asking children directly about suicide will not trigger subsequent suicidal thinking or behavior.

"Parents need to be aware of the warning signs of suicide, which include a child making suicidal statements, being unhappy for an extended period, withdrawing from friends or school activities or being increasingly aggressive or irritable," he said. "If parents observe these warning signs in their child, they should consider taking the child to see a mental health professional."

florida80 05-19-2019 18:17

News Release 16-May-2019

Study may help prevent relapse in cocaine use disorder patients

Brazilian researchers combined cognitive dysfunction tests with an analysis of drug use patterns to identify patients at high risk of relapse after treatment

Fundação de Amparo à Pesquisa do Estado de São Paulo




A study conducted at the University of São Paulo (USP) in Brazil and described in an article published in the journal Drug and Alcohol Dependence may help healthcare workers identify patients who risk relapse after undergoing treatment for cocaine use disorder.

According to the authors, the findings reinforce the need to offer individualized treatment strategies for these cases, which are considered severe.

The principal investigator for the study was Paulo Jannuzzi Cunha, a professor at the University of São Paulo's Medical School (FM-USP), with a postdoctoral research scholarship from São Paulo Research Foundation - FAPESP and support from the National Council for Scientific and Technological Development (CNPq).

For 30 days, the researchers monitored 68 patients admitted to the Psychiatry Institute of Hospital das Clínicas, FM-USP's general and teaching hospital in São Paulo, for treatment of cocaine dependence. All patients volunteered to participate in the study. They were contacted three months after discharge to verify their abstinence. All but 14 reported a relapse, defined as at least one episode of cocaine use during the period.

One of the goals of the study was to determine whether the 11 criteria for diagnosing chemical dependence established in the DSM-5 were also effective at predicting the response to treatment. The DSM-5 is the latest edition of the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association (APA) and regarded as the standard classification by mental health professionals in many parts of the world.

"Our hypothesis was that these criteria would not help accurately predict relapse. However, after completing the study, we recognized that they may in fact be useful for predicting vulnerability to relapse," Cunha told.

The DSM-5 criteria for diagnosing a substance use disorder include taking the drug in larger amounts and/or for longer periods than intended; craving, or a strong desire to use the drug; giving up social, occupational or recreational activities because of substance use; continued use despite having persistent social or interpersonal problems caused or exacerbated by the effects of the drug; tolerance (needing increasing amounts to achieve the desired effect); and withdrawal symptoms, among others.

According to the DSM-5 guidelines, a substance use disorder can be considered mild if two or three of the 11 criteria are met for a year, moderate if four or five are met for a year, or severe if six or more are met for a year.

"Our sample included only cases classified as severe. We observed a clear difference between patients who met six to eight criteria and those who met nine to 11. The relapse rate was significantly higher in the latter group," said Danielle Ruiz Lima, first author of the article.

The results suggested that the three categories recommended by the DSM-5 should be reviewed. "There seems to be an 'ultra-severe' group as well as a 'severe' group," the researchers said.
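
The DSM-5 thresholds described above, plus the finer split the USP group observed, map onto a simple lookup. A minimal sketch; note that the 'ultra-severe' tier is the researchers' proposal, not an official DSM-5 category:

    def dsm5_severity(criteria_met: int) -> str:
        """Classify substance use disorder severity from the number of
        DSM-5 criteria met (out of 11) over a 12-month period."""
        if not 0 <= criteria_met <= 11:
            raise ValueError("DSM-5 defines 11 criteria")
        if criteria_met < 2:
            return "no diagnosis"
        if criteria_met <= 3:
            return "mild"
        if criteria_met <= 5:
            return "moderate"
        # The USP study observed higher relapse among patients meeting 9-11
        # criteria, suggesting an 'ultra-severe' split within 'severe';
        # this tier is the researchers' proposal, not part of DSM-5.
        return "severe (6-8)" if criteria_met <= 8 else "ultra-severe (9-11, proposed)"

    print(dsm5_severity(7))    # -> severe (6-8)
    print(dsm5_severity(10))   # -> ultra-severe (9-11, proposed)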

Refining the analysis

Another hypothesis investigated was that the pattern of cocaine use (DSM-5 score plus factors such as age at use onset and intensity of use in the month prior to hospital admission) and the cognitive deficit caused by the drug were related variables that could help predict a relapse after treatment.

The researchers used several tests to assess the participants' executive functions, which include working memory (required for specific actions, such as those of a waiter who has to associate orders with customers and deliver them correctly), sustained attention (the ability to focus for as long as it takes to complete a specific task, such as filling out a form, without being distracted), and inhibitory control (the capacity to control impulses).

The researchers applied the tests, on average, a week after admission, allowing enough time for urine toxicology to come back negative. The aim of this waiting period was to avoid measuring the acute effects of the drug on the organism.

One of the tests, used to measure memory span, required the patient to repeat an ascending sequence of numbers presented by the researchers. Patients who had used cocaine intensively in the month before admission underperformed on this test.

In other tests, used to measure selective attention, cognitive flexibility and inhibitory control, patients were asked to repeat a sequence of colors and then to name the ink color in which the name of a different color was printed (the word "yellow" printed in blue ink, for example).

"The automatic response is to read what's written instead of naming the ink color, as required. Inhibitory control is essential to perform this task. This function is extremely important in the early stages of drug dependence rehabilitation, when the patient has to deal with craving and situations that stimulate the desire to use the drug," Cunha said.

The analysis showed a correlation between inhibitory control test results and the age at cocaine use onset. "The earlier the onset, the more mistakes they made, which could point to a higher risk of relapse. The idea is to identify the people who have substantial difficulties so that we can work out individualized treatment strategies," Lima said.

Intense cocaine use in the 30 days preceding admission also correlated with underperformance on the inhibitory control test and on the working memory test, in which the patient was asked to listen to a sequence of numbers and repeat them in reverse order. This task reflects an important function: the manipulation of information as a basis for decision making.
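
The reverse-order task is the backward digit span. A minimal sketch of how such a span is scored; the session below is hypothetical:

    # Backward digit span, as described above: the patient hears a sequence
    # and must repeat it in reverse order. A simple span scorer (illustrative).
    def correct_backward(presented, answered):
        return list(reversed(presented)) == list(answered)

    # Hypothetical session: sequences grow until the patient fails.
    session = [
        ([4, 7], [7, 4]),                 # correct
        ([2, 9, 5], [5, 9, 2]),           # correct
        ([8, 1, 6, 3], [3, 6, 8, 1]),     # error: last two digits swapped
    ]
    span = 0
    for presented, answered in session:
        if not correct_backward(presented, answered):
            break
        span = len(presented)
    print(f"Backward digit span: {span}")   # -> 3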

According to the researchers, studies have shown that executive functions may be restored after a period of abstinence, but to what extent and how long this recovery may take are unknown.

The research group at FM-USP advocates cognitive rehabilitation programs to support the recovery process. "We have a motivational chess proposal that's being studied right now," Cunha said. "A therapist plays chess with the patient and then discusses the moves, making analogies with the patient's life in an attempt to transfer the knowledge acquired in the chess game to day-to-day experience and train inhibitory control, planning and healthy decision making in the real world."

In his view, the assessment of cognitive deficits is important both for diagnosing substance use disorder and for predicting relapse. "Chemical dependence is a brain disease, and these neuropsychological tests offer a yardstick with which to measure objectively whether the damage is mild, moderate or severe, along similar lines to the classification of dementia," he said.

According to the researchers, despite the clinical relevance of neuropsychological test results, they are not part of the DSM-5 criteria, which are based solely on self-reported factors and clinical observations.

"We hope the sixth edition of the DSM takes these developments into account in recognition of the research done by us and other groups worldwide," Cunha said.

###

florida80 05-19-2019 18:18

News Release 17-May-2019

Integrated stepped alcohol treatment for people in HIV care improves both HIV & alcohol outcomes


NIH/National Institute on Alcohol Abuse and Alcoholism



At the end of the six-month study, while both groups reported reduced alcohol intake, the researchers found no differences in drinks per week or HIV outcomes between the ISAT and control groups. Both groups then continued AUD treatment under treatment-as-usual (control) conditions. At the 12-month follow-up, individuals who had initially received ISAT were found to have fared better than individuals who only received treatment as usual. People in the ISAT group, for example, reported having fewer drinks per drinking day than people in the control group and a greater percentage of days abstinent. The ISAT group also had a higher percentage of people who reported no heavy drinking days.
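
The outcome measures named here can all be computed from a simple daily drink diary. A minimal sketch; the diary is hypothetical, and the heavy-drinking cutoffs are the standard NIAAA thresholds (five or more drinks per day for men, four or more for women):

    # How the reported outcome measures can be computed from a daily drink
    # diary. The diary below is hypothetical; thresholds are the standard
    # NIAAA heavy-drinking cutoffs.
    def alcohol_outcomes(daily_drinks, sex="male"):
        threshold = 5 if sex == "male" else 4
        drinking_days = [d for d in daily_drinks if d > 0]
        return {
            "drinks_per_drinking_day": (sum(drinking_days) / len(drinking_days)
                                        if drinking_days else 0.0),
            "percent_days_abstinent": 100 * (1 - len(drinking_days) / len(daily_drinks)),
            "heavy_drinking_days": sum(d >= threshold for d in drinking_days),
        }

    diary = [0, 0, 2, 0, 6, 0, 0, 3, 0, 0, 0, 5, 0, 0]   # 14 hypothetical days
    print(alcohol_outcomes(diary, sex="male"))
    # -> 4.0 drinks per drinking day, ~71.4% days abstinent, 2 heavy days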

"Importantly, we also observed that participants randomized to stepped AUD treatment were more likely to achieve an undetectable HIV viral load," said Dr. Edelman. "We believe that with decreased alcohol consumption, participants in the ISAT group were more likely to take their HIV medications consistently, translating into improved HIV viral control."

In an invited commentary on the new research in The Lancet HIV, co-authors Lorenzo Leggio, M.D., Ph.D., Senior Investigator in the NIH Intramural Research Program at NIAAA and NIDA, and Roberta Agabio, M.D., a psychiatrist at the University of Cagliari in Italy, welcomed the new findings as important for the HIV field and beyond.

"Stepped care approaches have been found to be effective for treating a variety of chronic diseases," said Dr. Leggio.

"These findings are a first indication of their potential value for treating AUD in the context of HIV treatment. The results warrant further investigation on how to optimize its use among people with HIV, and to explore its integration in other medical care settings. Indeed, the study is a compelling example of the need for trained clinicians across the spectrum of health care to recognize and treat AUD as a medical disorder amenable to a variety of treatment approaches."

###

florida80 05-19-2019 18:18

News Release 17-May-2019

'Stepped' treatment reduces drinking in patients with HIV


Yale University




New Haven, Conn. -- People with HIV who drink too much were more likely to reduce drinking after undergoing an approach to care known as integrated stepped alcohol treatment, according to a Yale-led study.

The finding supports greater use of this treatment model in HIV clinics to improve outcomes for patients with both HIV and drinking problems, the researchers said.

The study was published in The Lancet HIV.

Stepped care is used to treat some patients with chronic diseases such as hypertension and depression. It entails the use of different treatments that are "stepped up," or increased in intensity over time, in response to patients' needs. Prior to this new study, little research had been done to evaluate the impact of stepped care for patients struggling with alcohol use disorder, and none had been conducted in HIV treatment settings, the researchers said.

The research team recruited 128 individuals from one of five Veterans Affairs-based HIV clinics. They randomized the patients into one of two groups -- those given integrated stepped alcohol treatment and an equal number receiving treatment as usual.

The stepped-care patients were offered evidence-based treatments, including medication, motivational therapy, and specialty care at either an outpatient or residential treatment facility. By comparison, the treatment-as-usual patients were referred to specialty addiction treatment at the VA at the discretion of their HIV clinician.

At the end of the study period, the researchers found that patients who received integrated stepped care fared better overall. After 52 weeks, stepped-care patients had fewer heavy drinking days, drank less per drinking day, and had more days of abstinence, the researchers noted.

"We saw overall improvements in drinking," said Jennifer Edelman, M.D., lead author and associate professor in internal medicine. "We also found improved HIV outcomes at the 52-week mark."

The improvements in patients' HIV status were presumably associated with the reduced alcohol use, Edelman noted. "Over time, the patients receiving integrated stepped care showed decreases in alcohol use and a higher rate of undetectable HIV viral load, likely related to improved HIV medication adherence," she said.

The study results support the expanded use of integrated stepped care for alcohol misuse in settings where patients are already being treated for HIV, the researchers said.

florida80 05-19-2019 18:19

News Release 15-May-2019

Legal marijuana reduces chronic pain, but increases injuries and car accidents

Overall hospital stays remain steady after Colorado legalized cannabis, UCSF study finds

University of California - San Francisco




The legalization of recreational marijuana is associated with an increase in its abuse, injury due to overdoses, and car accidents, but does not significantly change health care use overall, according to a study by researchers at UC San Francisco.

In a review of more than 28 million hospital records from the two years before and after cannabis was legalized in Colorado, UCSF researchers found that Colorado hospital admissions for cannabis abuse increased after legalization, in comparison to other states. But taking the totality of all hospital admissions and time spent in hospitals into account, there was not an appreciable increase after recreational cannabis was legalized.

The study, appearing online May 15, 2019, in BMJ Open, also found fewer diagnoses of chronic pain after legalization, consistent with a 2017 National Academy of Sciences report that concluded substantial evidence exists that cannabis can reduce chronic pain.

"We need to think carefully about the potential health effects of substantially enhancing the accessibility of cannabis, as has been done now in the majority of states," said senior author Gregory Marcus, MD, MAS, a UCSF Health cardiologist and associate chief of cardiology for research in the UCSF Division of Cardiology.

"This unique transition to legalization provides an extraordinary opportunity to investigate hospitalizations among millions of individuals in the presence of enhanced access," Marcus continued. "Our findings demonstrate several potential harmful effects that are relevant for physicians and policymakers, as well as for individuals considering cannabis use."

According to the 2014 National Survey on Drug Use and Health, more than 117 million Americans, or 44.2 percent of all Americans, have used cannabis in their lifetime, and more than 22 million report having used it within the past 30 days. While cannabis use remains a federal crime because it is a controlled substance, 28 states and the District of Columbia now allow it for treating medical conditions. Nine of those states have legalized it for recreational use.

To understand the potential shifts in health care use resulting from widespread policy changes, Marcus and his colleagues reviewed the records of more than 28 million individuals in Colorado, New York and Oklahoma from the 2010-2014 Healthcare Cost and Utilization Project, which included 16 million hospitalizations. They compared rates of health care utilization and diagnoses in Colorado during the two years before and the two years after recreational marijuana was legalized in December 2012 with rates in New York, a geographically distant and urban state, and in Oklahoma, a geographically close and mainly rural state.
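
This before/after comparison of Colorado against unexposed states is essentially a difference-in-differences contrast. A stylized sketch of that arithmetic; the rates below are invented for illustration, not the study's numbers:

    # Stylized difference-in-differences contrast, as in the study design:
    # change in Colorado minus change in a comparison state. The rates are
    # invented for illustration (admissions per 1,000 hospitalizations).
    rates = {
        ("Colorado", "pre"): 10.0, ("Colorado", "post"): 12.5,
        ("New York", "pre"): 9.0,  ("New York", "post"): 9.5,
    }

    change_co = rates[("Colorado", "post")] - rates[("Colorado", "pre")]
    change_ny = rates[("New York", "post")] - rates[("New York", "pre")]
    did = change_co - change_ny
    print(f"Difference-in-differences estimate: {did:+.1f} per 1,000")
    # -> +2.0 per 1,000 attributable (under DiD assumptions) to legalization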

The researchers found that after legalization, Colorado experienced a 10 percent increase in motor vehicle accidents, as well as a 5 percent increase in alcohol abuse and overdoses that resulted in injury or death. At the same time, the state saw a 5 percent decrease in hospital admissions for chronic pain, Marcus said.

"There has been a dearth of rigorous research regarding the actual health effects of cannabis consumption, particularly on the level of public health," said Marcus, holder of the Endowed Professorship of Atrial Fibrillation Research in the UCSF School of Medicine. "These data demonstrate the need to caution strongly against driving while under the influence of any mind-altering substance, such as cannabis, and may suggest that efforts to combat addiction and abuse of other recreational drugs become even more important once cannabis has been legalized."

The study findings may be beneficial in guiding future decisions regarding cannabis policy, the researchers said.

"While it's convenient and often most compelling to simplistically conclude a particular public policy is 'good' or 'bad,' an honest assessment of actual effects is much more complex," Marcus said. "Those effects are very likely variable, depending on each individual's idiosyncratic needs, propensities and circumstances. Using the revenues from recreational cannabis to support this sort of research likely would be a wise investment, both financially and for overall public health."

The researchers could not explain why overall health care utilization remained essentially neutral, but said the harmful effects simply may have been diluted among the much larger number of total hospitalizations. They said it also may be that beneficial effects at the individual or societal level, such as reductions in violent crime, counterbalanced the negatives.

###

florida80 05-19-2019 18:20

News Release 14-May-2019

Preclinical study: Probiotic-derived molecule may suppress fatal brain inflammation

City of Hope scientists explore how swallowing a 'bacterial envelope molecule' may prime your body to fend off viral infections

City of Hope

DUARTE, Calif. -- The existence of certain microorganisms in your gut may bolster the immune system's ability to fend off a herpes viral attack that can cause fatal brain inflammation, reports a new City of Hope-led study.

Researchers say the findings are the first to suggest that an envelope molecule from a bacterium called Bacteroides fragilis (B. fragilis) might be useful against viral inflammatory diseases. Called capsular polysaccharide A (PSA), the envelope molecule appears to promote protective, anti-inflammatory responses during a viral infection, said Ramakrishna Chandran, Ph.D., and Edouard Cantin, Ph.D., authors of the study and virology and immunology experts at City of Hope.

"This mouse study shows that B. fragilis PSA can temper the immune system so that infection does not result in an uncontrolled, potentially fatal inflammatory response in the brain," Cantin said. "Although herpes simplex encephalitis is a rare brain inflammation disorder, the lessons we learned here might, with more research, be applicable to other viral infections such as other herpesviruses, influenza virus, West Nile virus and maybe even viral respiratory diseases - conditions where inflammation begins to jeopardize the health of your body and brain function."

Herpes simplex encephalitis affects about 2,000 people in the United States each year and has a high mortality rate if symptoms are not recognized and patients aren't treated promptly; survivors usually have serious neurological conditions. About 70% of untreated individuals die, according to multiple scientific reports.

The study, published in Nature Communications on May 14, found that B. fragilis' bacterial envelope molecule, PSA, elicits regulatory T and B cells that keep the immune system from overproducing harmful inflammatory responses triggered by herpes simplex virus infection. In other words, PSA reduced brainstem inflammation by promoting the appearance of IL-10-secreting regulatory T and B cells. IL-10 is a strong anti-inflammatory cytokine that creates a protective response preventing encephalitis.

"It's possible that consumption of certain prebiotics, probiotics or synbiotics may enhance your body's natural ability to suppress inflammatory diseases," Chandran said. "Our study provides an exciting proof of principle that needs further research validation, but it seems reasonable that what you decide to eat may affect your overall health and ability to fight off disease."

Chandran, Cantin and their colleagues found that mice pretreated with the candidate probiotic, B. fragilis or PSA, survived a lethal herpes simplex virus infection, whereas mice pretreated with a placebo did not survive despite the fact that both groups were given Acyclovir, an antiviral that is the standard of care for herpes simplex virus encephalitis. The finding suggests that the probiotic-derived PSA optimizes the immune system to fight against viruses, especially those that induce damaging inflammation.

The researchers reported on the important role B cells play in extinguishing inflammation. B cells are a type of white blood cell that secretes antibodies. When the scientists depleted mice of their B cells prior to PSA treatment, the mice lost their ability to marshal regulatory T cells and to secrete the anti-inflammatory cytokine IL-10. The researchers showed that B cells bound PSA, and this was crucial for induction of protective regulatory T cells, which secrete the anti-inflammatory cytokine IL-10. Thus, eliminating B cells rendered the immune system weaponless in the fight against fatal herpes simplex virus brain inflammation.

###

florida80 05-19-2019 18:22

News Release 9-May-2019

Neurodevelopmental disorders may be rooted in genetics and mitochondrial deficits

In a model of DiGeorge/22q11 Deletion Syndrome, George Washington University researchers found a connection between mitochondrial dysfunction and cortical under-connectivity and cognitive impairment; Function was restored through antioxidant therapy

George Washington University


 


Under-connectivity, or too few connections in the brain, is the underlying cause of brain disorders like autism and schizophrenia, according to a recent study from investigators at the George Washington University (GW) Institute for Neuroscience. The study, published in Neuron, provides the first evidence showing that individual nerve cells fail to make the right number of connections. The reason for this deficit is limited growth of key nerve cells in the cerebral cortex during early development, due to both genetics and mitochondrial dysfunction.

Working in models of DiGeorge/22q11 Deletion Syndrome -- a common disorder with the highest known genetic association with diseases like schizophrenia and autism -- the GW group defined the disruptions of cell and molecular functions that lead to altered development of nerve cells and their connections in the cerebral cortex. They associated these changes with behavioral deficits linked to neurodevelopmental disorders. This work confirms, for the first time, a well-known clinical hypothesis that under-connectivity is the basis of these disorders.

The work from the GW research team also showed that cortical under-connectivity and cognitive impairment are linked to genes that cause mitochondrial dysfunction in DiGeorge/22q11 Deletion Syndrome. When mitochondrial function is restored through antioxidant therapy, so are cortical connections and behavioral deficits.

"The good news is that mitochondrial deficits are very treatable pharmacologically or through diet" said senior author Anthony-Samuel LaMantia, PhD, director of the GW Institute for Neuroscience and Jeffrey Lieberman Professor of Neurosciences at the GW School of Medicine and Health Sciences. "Our research holds up the possibility that in some instances, children who are diagnosed with neurological disorders may have genetic deficits that lead to a final common pathway of focal metabolic disruption -- ultimately, this is treatable."

Using a DiGeorge/22q11 Deletion Syndrome mouse model, the GW research team first sought to confirm that under-connectivity, not over-connectivity, underlies behavioral deficits. They found that the integrity and efficiency of synapses in the cortex were diminished. The GW team then looked at the cells making the connections. They found the cells were unhealthy due to dysfunctional mitochondria, long known to be the powerhouse of the cell. The research team then tested the theory that mitochondria in these cells might be dysfunctional due to increased reactive oxygen species -- oxygen molecules that roam freely through cells and cause extensive damage. Finally, the team used antioxidant therapy to neutralize these dangerous oxygen "free radicals" to help restore mitochondrial function. This therapy not only fixed connectivity, but fixed the behavioral deficits that happened as a result.

The team then looked at the 28 genes on chromosome 22 for which one of two copies is lost in individuals with DiGeorge/22q11 Deletion Syndrome. They focused on six of these 28 genes associated with mitochondria. They identified the Txnrd2 gene, which encodes an enzyme that neutralizes reactive oxygen in mitochondria, as a critical player in the growth and connectivity of the cortical cells that are under-connected.

"This is one of the first times that any group has gone from genetic mutation, to cell biological pathology, to behavioral consequences, and then to safe, effective therapy that corrects both the pathology and behavioral impairment in a valid animal model of any neurodevelopment disorder," said LaMantia

