
Yale Scientific

THE NATION’S OLDEST COLLEGE SCIENCE PUBLICATION • ESTABLISHED IN 1894

SEPTEMBER 2020 VOL. 93 NO. 2 • $6.99

16 THE ROLE OF DEATH IN PLANT LIFE

13 DRIVERS PREDICTING THE FUTURE

21 DOGS ON DUTY

26 KEEPING DRY UNDERWATER

28 THE NEW 98.6 DEGREES

TABLE OF CONTENTS

VOL. 93 ISSUE NO. 2

More articles online at www.yalescientific.org & https://medium.com/the-scope-yale-scientific-magazines-online-blog

COVER ARTICLE

16 The Role of Death in Plant Life
Cindy Kuang

As agriculture adapts to meet modern demands, organizations such as the USDA are advancing policies based on the idea that more soil organic matter leads to increased crop productivity. Yale researchers investigate the scientific basis behind this idea.

10 Tracking Drugs
Anna Sun

Yale researchers from the Department of Biomedical Engineering have developed a quantitative microscopy approach to efficiently measure the circulation half-life of fluorescently tagged agents. With the power to detect subtle changes in the concentrations of nanoparticles and antibodies, scientists may soon be able to rapidly screen a wide range of therapeutic agents for clinical trials.

13 Drivers Predicting the Future
Maria Fernanda Pacheco

Yale researchers create a model that uses driver mutations to estimate tumor growth patterns and to predict the development of cancer.

18 Building Better Qubits
Shoumik Chowdhury

A key step toward building robust quantum computers is designing quantum bits (qubits) protected against noise. We report on a recent development from quantum information researchers at Yale demonstrating a new kind of qubit that may achieve this.

2 Yale Scientific Magazine September 2020 www.yalescientific.org


4 Q&A

6 NEWS

20 FEATURE

28 SPECIALS

30 PROFILES

Why Does Stress Turn Your Hair Gray? • Anmei Little

Does Being Kind Make You Feel Less Pain? • Selma Abouneameh

Let Them Grow • Athena Stenor

Implications of Impella • Mia Jackson

Tectonic Plate Deformation • Dhruv Patel

Analyzing Autoinducer-3 • Jerry Ruvalcaba

The Duality of the Ebola Virus • Sydney Hirsch

Micro Molecule, Huge Impact • Beatriz Horta

Show and Tell • Eva Syth

Dogs on Duty • Makayla Conley

An Algorithmic Jury • Mirilla Zhu

Letting Experience Guide the Way • Zoe Posner

Keeping Dry Underwater • Yu Jun Shen

Counterpoint: The New 98.6 Degrees • Kelly Farley

Science vs. The Apocalypse: Antibiotic Resistance • Victoria Vera

Undergraduate: Alon Millet (BR '20) • Katherine Dai

Alumni: Tze-Chiang Chen (PhD '85) • Nadean Alnajjar



DOES BEING KIND MAKE YOU FEEL LESS PAIN?

&

WHY DOES STRESS TURN YOUR HAIR GRAY?

By Anmei Little

Have you ever noticed that politicians and CEOs develop gray hairs fairly quickly? Usually, the growth of gray or white hairs accompanies old age due to the natural depletion of melanocyte stem cells (MeSCs), which are responsible for hair pigmentation. A recent study by Ya-Chieh Hsu's lab at the Harvard Stem Cell Institute discovered that acute stress caused mice to develop gray hairs through the same process as aging: MeSC depletion. This could explain the premature graying of politicians, CEOs, and others who experience stress on a daily basis.

In their experiment, the researchers induced stress in mice by injecting them with a chemical called resiniferatoxin (RTX), which acts similarly to the compound in chili peppers responsible for the burning sensation of spice. In response to this stressor, the researchers observed that the mice's sympathetic nervous system became overstimulated and activated a "fight-or-flight" response, releasing a neurotransmitter called noradrenaline. Noradrenaline caused MeSCs to multiply and migrate, depleting the reservoir and leading to premature graying. Interestingly, if noradrenaline release was blocked, the mice's hair did not lose color. This suggests that noradrenaline is associated with, and perhaps even necessary for, the mechanism of hair graying.

While it may sound like bad news that your stress spawns gray hairs, scientists are deeply interested in further exploring the mechanisms of MeSC depletion. Future research may well discover a way to prevent both stress-induced and age-induced graying. To the next generation of politicians and leaders: do not fret—the era of hair dye might pass soon. ■

By Selma Abouneameh

From morphine to ibuprofen, modern medicine has made enormous progress in the field of pain management. However, a recent study conducted by Professor Xiaofei Xie's lab at Peking University showed that medicine might not be the only means by which we can attain pain relief. The proposed alternative? Helping others.

Why do people help others? The question has plagued scientists for centuries—after all, altruism is a costly behavior. The researchers' study sought to address this paradox. "Our experiments suggest that altruism is not just other-benefiting, but it benefits the performers as well," said Yilu Wang, lead author of the study. According to their results, a key benefit is pain relief.

To gain a deeper understanding of how altruism affects our biology, the authors conducted three experiments that placed individuals in either altruistic or non-altruistic roles, then either induced pain or measured naturally existing pain. In one of these experiments, the researchers used functional MRI (fMRI) to measure how brain activity changed after participants performed the altruistic task of donating money to orphans. The fMRI results showed that those who had donated money exhibited decreased activity in the right insula, an area of the brain responsible for feeling pain.

Thus, performing altruistic behaviors regularly could serve as a "low-cost, side effect-free approach to supplement current therapies for chronic pain," Wang and Xie said. Their research not only sheds light on the psychological and biological motivations behind our behaviors but may also provide insight into a new method of pain management. ■

Wang, Y., Ge, J., Zhang, H., Wang, H., & Xie, X. (2020). Altruistic behaviors relieve physical pain. Proceedings of the National Academy of Sciences, 117(2), 950–958.

Zhang, B., Ma, S., Rachmin, I., He, M., Baral, P., Choi, S., Gonçalves, W. A., Shwartz, Y., Fast, E. M., Su, Y., Zon, L. I., Regev, A., Buenrostro, J. D., Cunha, T. M., Chiu, I. M., Fisher, D. E., & Hsu, Y.-C. (2020). Hyperactivation of sympathetic nerves drives depletion of melanocyte stem cells. Nature, 577(7792), 676–681. https://doi.org/10.1038/s41586-020-1935-3



The Editor-in-Chief Speaks

WORKING THROUGH CRISES

Last issue, I touted the Yale Scientific Magazine's steadfast commitment to communicating science "through . . . epoch-making events," thanks to our forebears' tireless efforts over the past 126 years. Through the events of the past few months, we at YSM have truly come to understand what it means to push forward in our mission, even as Yale and the world have been rocked by crisis after crisis.

It is thanks to an incredible feat of dedication and passion by our masthead and contributors that we are publishing this issue. Writers and editors conducted interviews and worked on drafts through spring break, against unprecedented academic uncertainty. Thanks to our webmasters, the content for this issue has been online since May; the production and business teams have also been working tirelessly in their respective domains to ensure a successful launch for Issue 93.2.

This issue's cover article by Cindy Kuang (page 16) gives an incisive account of a research study to determine how, and whether, soil organic matter contributes to crop productivity. We continue to highlight the best research from all corners of the Yale campus, be it harnessing the deadly Ebola virus to treat brain tumors (page 8) or building better superconducting qubits for quantum computing (page 18). Two Features articles in this issue are of note. One addresses the implications of an AI-driven jury (page 22), while the other covers the Angiosarcoma Project, a pioneering model for public and patient involvement in disease research (page 24). These represent a new effort to spotlight the social impacts of scientific advances, and we will tell more stories like these in the future.

The past few months have given us the COVID-19 pandemic and the tragedies leading to the Black Lives Matter movement. COVID-19 will change how we live our lives and how we do science. In a short span of months, we have gone from knowing nothing about SARS-CoV-2 to developing promising vaccines. YSM has attempted to capture this rapid, albeit meandering, progress in our summer COVID-19 Catch-Up series, which covers the latest COVID-19 literature on our social media. The Black Lives Matter movement, meanwhile, has made us recognize the dearth of recognition and participation of minority groups in science. As a science publication, YSM needs to do more. In our future coverage, we will strive to highlight more underrepresented voices in science. You can find our full statement on our website.

Wherever you are, please stay safe, take care, and wear a mask.

Marcus Sak, Editor-in-Chief

ABOUT THE ART

Soil is arguably nature's underdog. The substance reinforces extensive plant root systems, houses a diversity of fauna, and acts as a key player within nutrient cycles, yet its multifacetedness is often not fully acknowledged in our day-to-day lives. In this issue's cover, I explore soil as the focal point of the illustration.

Sophia Zhao, Cover Artist

MASTHEAD

September 2020 VOL. 93 NO. 2

EDITORIAL BOARD

Editor-in-Chief

Managing Editors

News Editor

Features Editor

Articles Editor

Online Editors

Copy Editors

Scope Editors

PRODUCTION & DESIGN

Production Manager

Layout Editor

Art Editor

Cover Artist

Photography Editor

Webmasters

Social Media Coordinator

BUSINESS

Publisher

Operations Manager

Advertising Managers

OUTREACH

Synapse Presidents

Synapse Vice President

Outreach Coordinators

STAFF

Selma Abouneameh

Nadean Alnajjar

Shoumik Chowdhury

Makayla Conley

Katherine Dai

Maria Fernanda Pacheco

Sydney Hirsch

Beatriz Horta

ADVISORY BOARD

Priyamvada Natarajan

Sandy Chang

Kurt Zilm, Chair

Fred Volkmar

Stanley Eisenstat

James Duncan

Stephen Stearns

Jakub Szefer

Werner Wolf

John Wettlaufer

William Summers

Scott Strobel

Robert Bazell

Craig Crews

Ayaska Fernando

Robert Cordova

Mia Jackson

Cindy Kuang

Anmei Little

Dhruv Patel

Zoe Posner

Jerry Ruvalcaba

Noora Said

Marcus Sak

Kelly Farley

Anna Sun

Xiaoying Zheng

Hannah Ro

James Han

Tiffany Liao

Maria Fernanda Pacheco

Nithyashri Baskaran

Serena Thaw-Poon

Lorenzo Arvanitis

Brett Jennings

Antalique Tran

Julia Zheng

Ellie Gabriel

Sophia Zhao

Kate Kelly

Siena Cizdziel

Matt Tu

Megan He

Sebastian Tsai

Jenny Tan

Stephanie Hu

Cynthia Lin

Michelle Barsukov

Katherine Dai

Chelsea Wang

Nadean Alnajjar

Blake Bridge

Yu Jun Shen

Anastasia Shilov

Ishani Singh

Athena Stenor

Eva Syth

Victoria Vera

Mirilla Zhu

Astronomy

Biological and Biomedical Sciences

Chemistry

Child Study Center

Computer Science

Diagnostic Radiology

Ecology & Evolutionary Biology

Electrical Engineering

Emeritus

Geology & Geophysics

History of Science, Medicine, & Public Health

Molecular Biophysics & Biochemistry

Molecular, Cellular, & Developmental Biology

Molecular, Cellular, & Developmental Biology

Undergraduate Admissions

Yale Science & Engineering Association

The Yale Scientific Magazine (YSM) is published four times a year by Yale Scientific Publications, Inc. Third class postage paid in New Haven, CT 06520. Non-profit postage permit number 01106 paid for May 19, 1927 under the act of August 1912. ISSN: 0091-287. We reserve the right to edit any submissions, solicited or unsolicited, for publication. This magazine is published by Yale College students, and Yale University is not responsible for its contents. Perspectives expressed by authors do not necessarily reflect the opinions of YSM. We retain the right to reprint contributions, both text and graphics, in future issues, as well as a non-exclusive right to reproduce these in electronic form. The YSM welcomes comments and feedback. Letters to the editor should be under two hundred words and should include the author's name and contact information. We reserve the right to edit letters before publication. Please send questions and comments to yalescientific@yale.edu. Special thanks to the Yale Student Technology Collaborative.


NANOMOLDED INTO PERFECTION

SCIENTISTS DISCOVER NOVEL WAY TO PRODUCE NANOMATERIALS

BY ATHENA STENOR

ILLUSTRATION COURTESY OF ELLIE GABRIEL

Increasingly, nanotechnology is entering our lives, allowing us to design more efficient batteries, build lightweight vehicles, and even administer needleless vaccines. However, a major barrier to accessing the full potential of nanoscale materials is their fabrication. A team of researchers from Yale University and Wuhan University recently discovered that thermomechanical nanomolding is a reliable method for the nanofabrication of ordered phases (OPs).

OPs, a class of materials consisting of sublattices occupied by atoms, are fundamental to various functional applications. Most functional materials, including superconductors, magnetic materials, and plasmonic materials, belong to this class. Creating nanoscale OPs is challenging because traditional techniques are impractical. For example, chemical vapor deposition growth, a technique in which a film of vaporized, decomposing chemical is deposited onto a substrate's surface, only works for easily vaporized OPs and cannot produce certain shapes.

Thermomechanical nanomolding, by contrast, can be scaled up for mass production, fine-tuned to obtain specific characteristics, and used with a variety of starting materials. In this technique, raw material is pushed into a nanomold at steady pressure and at a temperature below its melting point, producing single-crystalline nanowires of consistent composition and structure. The process can be tuned so that the nanowires' aspect ratios are high, enabling easier access to different morphologies. The researchers attribute their results to OPs' tendency to self-organize via a thermodynamic (stability-driven) rather than kinetic (speed-driven) mechanism. This new process will make OPs a more accessible class of nanomaterials and enable the development of exciting new applications for nanotechnology. ■

Liu, N., Xie, Y., Liu, G., Sohn, S., Raj, A., Han, G., Wu, B., Cha, J. J., Liu, Z., & Schroers, J. (2020). General Nanomolding of Ordered Phases. Physical Review Letters, 124(3). https://doi.org/10.1103/physrevlett.124.036102

IMPLICATIONS OF IMPELLA

RETHINKING TREATMENTS FOR CARDIOGENIC SHOCK

BY MIA JACKSON

ILLUSTRATION COURTESY OF NOORA SAID

More than one million Americans have a heart attack each year. Four to twelve percent of these patients develop cardiogenic shock, a condition that prevents the heart from pumping the blood necessary to meet the body's needs.

Two common treatments for cardiogenic shock are the left ventricular assist device (LVAD) and the intra-aortic balloon pump (IABP). While LVADs are more commonly used, Yale University researchers recently reported the results of a study examining the discrepancies in in-hospital clinical outcomes between the two devices.

The study found that LVADs were associated with higher in-hospital mortality and in-hospital major bleeding. "Because there is relatively little evidence in this area, we think that the analysis that we did, and the conclusion that we made … would have both policy implications and regulatory implications," explained Nihar Desai, an assistant professor at the Yale School of Medicine who worked on this study.

Given the two-and-a-half-fold increase in the utilization of LVADs between 2015 and 2017, Desai's research suggests that hospitals might need to rethink treatment plans for cardiogenic shock. "We hope everyone is trying to integrate these data into their practice, as it is on everyone to be a little more judicious of [LVADs], given little evidence that would support their use," Desai said. Hopefully, this research will influence the way many doctors treat cardiogenic shock patients. After all, for every patient who enters the emergency room, the most common treatment option available should be the best one for them. ■

Desai, N. R. (2020). Association of Use of an Intravascular Microaxial Left Ventricular Assist Device vs Intra-aortic Balloon Pump With In-Hospital Mortality and Major Bleeding Among Patients With Acute Myocardial Infarction Complicated by Cardiogenic Shock. JAMA, 323(8), 734–45. https://doi.org/10.1001/jama.2020.0254



WHAT'S IN THE EARTH'S MANTLE?

HOW WEAKER MATERIAL IN OLDER PLATES MAY INCREASE DEFORMATION

BY DHRUV PATEL

ILLUSTRATION COURTESY OF ANMEI LITTLE

Tectonic plates and subducting slabs, portions of tectonic plates that have slid under other plates, hold the secrets to the movement of land masses. Previous evidence showed that older slabs, which are colder and supposedly stronger, deform in the mantle more than warm, and presumably weaker, plates. This phenomenon puzzled researchers. Professor Jennifer Girard of Yale University's Department of Earth and Planetary Sciences, along with a team of researchers including Anwar Mohiuddin and Shun-ichiro Karato, aimed to uncover the basis for this unexpected deformation. They used a high-pressure, high-temperature press to simulate the conditions in the Earth's mantle, allowing them to study the deformation and subduction of slabs on a smaller scale.

The team found that when a subducting slab made mostly of large-grained olivine plunges into the mantle, the increased pressure causes the olivine to transform into fine-grained ringwoodite. "The study clearly shows that newly formed fine-grained ringwoodite is significantly weaker than the coarse-grained olivine," Girard said. While newly formed ringwoodite in cold slabs grows slowly, higher temperatures in warmer slabs cause grains of ringwoodite to grow much faster, and the young slabs become much stronger as the grains grow. In fact, the team believes that this inhibited growth rate may be the reason that cold slabs deform while warmer slabs do not. These findings will help researchers further explore and understand the unexpected behavior of tectonic plates. ■

Mohiuddin, A., Karato, S., & Girard, J. (2020). Slab weakening during the olivine to ringwoodite transition in the mantle. Nature Geoscience, 13, 170–4. https://doi.org/10.1038/s41561-019-0523-3


ANALYZING AUTOINDUCER-3

UNDERSTANDING CRITICAL BACTERIA

BY JERRY RUVALCABA

ILLUSTRATION COURTESY OF SOPHIA ZHAO

Escherichia coli is a dominant bacterial member of the human intestinal tract and a major model organism in biology. Some strains of E. coli contribute to a healthy gut ecosystem, whereas others are pathogens that cause over a million infections worldwide and often develop antibiotic resistance. Despite this, the mechanisms regulating its population-level phenotypes, which are associated with pathogenesis, have remained elusive. Researchers in Professor Jason Crawford's group in the Departments of Chemistry and Microbial Pathogenesis at Yale University, however, have illuminated a key pathway underlying this phenomenon at the molecular level. They have discovered the structure and pathway of autoinducer-3 (AI-3), a previously uncharacterized signal responsible for regulating virulence.

This signal is secreted by the bacteria during growth, accumulating as cells divide and allowing the bacteria to assess their numbers. The researchers isolated the metabolite by applying cellular stress, then determined its structure using one- and two-dimensional nuclear magnetic resonance spectroscopy. Additionally, the effects of AI-3 were tested on both bacteria and human tissue. Upon introducing the metabolite to a strain of E. coli that causes intestinal lesions and kidney failure, the bacteria became more virulent. When it was introduced to human tissue, an inflammatory effect was observed, indicating that human cells can detect and combat these signals.

The elucidation of the AI-3 structure and pathway is a crucial step forward in microbial pathogenesis. "It can be used to determine the collection of genes regulated by the AI-3 molecule in other pathogenic bacteria," Crawford said. These findings pave the way to combatting virulence in a variety of pathogens. ■

Kim, C. S., Gatsios, A., Cuesta, S., Chong Lam, Y., Wei, Z., Chen, H., Russell, R. M., Shine, E. E., Wang, R., Wyche, T. P., Piizzi, G., Flavell, R. A., Palm, N. W., Sperandio, V., & Crawford, J. M. (2020). Characterization of Autoinducer-3 Structure and Biosynthesis in E. coli. ACS Central Science, 6(2), 197–206. https://doi.org/10.1021/acscentsci.9b01076



NEWS

Biochemistry

THE DUALITY OF THE EBOLA VIRUS

How a Deadly Viral Infection Can Be Harnessed for Healing

BY SYDNEY HIRSCH

IMAGE COURTESY OF FLICKR

When we hear "Ebola," we often think of its contagion and lethality, and of the outbreaks in recent years. Ironically, scientists are exploring the potential of the deadly Ebola virus (EBOV) as a treatment against a fatal form of cancer: brain tumors. Cancer cells lack the ability to generate an immune response against viruses, making viruses a good starting point for developing treatments. Of course, infecting someone with a lethal virus is risky; to circumvent this, scientists use chimeric viruses, which contain a mix of genes from multiple parent viruses. A team of researchers, including Yale professor Anthony van den Pol, recently reported their efforts to test three variations of a chimeric virus pairing an EBOV glycoprotein with the vesicular stomatitis virus (VSV). They chose the Ebola gene given the virus's propensity to infect—and for their purposes, target—nerve tissue. Specifically, they took interest in the mucin-like domain (MLD) of the Ebola virus and how it modulates the virus's ability to target brain tumors. Interestingly, the MLD seemed to protect normal cells from infection, while cancer cells still became infected. The researchers were hopeful that VSV-EBOV could be a promising treatment, as the combination had been an effective and safe vaccine in humans during the African Ebola outbreak.

The team tested three viruses on severe combined immunodeficient (SCID) mice that had human brain tumor cells injected into their brains: VSV-EBOV, which contains the mucin-like domain (MLD); VSV-EBOVΔMLD, a parallel construct lacking the mucin-like domain; and VSV-EBOVΔMLD-GFP, which is almost identical to VSV-EBOVΔMLD but carries an added green fluorescent protein (GFP) reporter gene to visualize the virus. All three showed some increase in the mice's survival. The researchers found that VSV-EBOV was most effective in treating the brain tumors while maintaining the health of the mouse. At 120 days after the tumor implant, only mice infected with the MLD-containing virus remained alive.

The researchers considered VSV-EBOV successful because it minimally infected healthy neural cells while still targeting tumor cells. Van den Pol's team quantified the extent of the brain infection by counting the number of infected neurons and glial cells in coronal brain sections. The other two virus forms showed widespread infection throughout the brains of the animals. The VSV-EBOVΔMLD-GFP was the least effective: while it modestly extended the survival of the mice, all of the mice died. Some were incompletely infected by the virus, and many still had brain tumors. The VSV-EBOVΔMLD-injected tumors had similar tissue structure and a greater survival rate.

The lethality of the VSV-EBOVΔMLD-GFP virus may have been due to the VSV backbone itself; this differs from those of the non-GFP chimeric viruses in all four of its base proteins, which may alter the behavior of the virus. Van den Pol explained that the Ebola virus may release the MLD as a "false leader, causing the immune system to be lured away from the infected cells." This slowed replication of the virus and lessened the amount of infectious viral offspring. With a slower replication rate, the innate immune system has more time to upregulate antiviral defenses. The low number of infected normal cells suggested that the innate immune system was sufficient to prevent the spread of the virus.

The researchers also compared the effects of intravenous versus intracranial injection. Both methods had degrees of success. Intracranial injection showed greater tumor infection and elimination, indicating that this type of delivery may be more reliable for treating larger tumors. Intravenous injections, which are done through the tail vein in mice, could on the whole be more effective for smaller or undetected types of metastatic cancer, such as melanomas.

Van den Pol and his team were able to monitor the impact of the MLD on the treatment and survival of SCID mice. The chimeric virus containing the mucin-like domain, VSV-EBOV, was the most successful treatment, confirming their initial expectations. This research is promising, as it could open the door for new forms of glioblastoma treatment. "VSV-EBOV has been successfully used in the human population in the past, showing that it's relatively safe. If we're ultimately trying to move toward clinical studies, that's a hurdle already jumped over," Van den Pol said. Future directions include looking at tumors in immunocompetent mice or exploring other VSV-based viruses. ■

Zhang, X., Zhang, T., Davis, J. N., Marzi, A., Marchese, A. M., Robek, M. D., & van den Pol, A. N. (2020). Mucin-Like Domain of Ebola Virus Glycoprotein Enhances Selective Oncolytic Actions against Brain Tumors. Journal of Virology, 94(8), e01967-19.



NEWS

Molecular Biology

MICRO MOLECULE, HUGE IMPACT

How microRNAs Inhibit Asthmatic Reactions

BY BEATRIZ HORTA

IMAGE COURTESY OF RYAN JEFFS

Despite its name, microRNA helps the body on a macro scale. The small molecule, only about twenty-two nucleotides long, plays an important role in the regulation of gene expression by binding to mRNA at certain points in its sequence. At the Yale School of Medicine, Shervin Takyar and his team investigated the role microRNAs play in controlling eosinophilia through endothelial cells. Eosinophilia occurs when large numbers of white blood cells called eosinophils are recruited to a site in the body, leading to the allergic airway reaction characteristic of asthma and chronic rhinosinusitis (CRS).

The paper's first point of analysis was investigating which inhibitory factors interact with VEGF (vascular endothelial growth factor). "We looked at a vascular factor [endothelial cells] and inhibition, investigating their role in asthma and CRS," Takyar explained. Takyar's team found that microRNA-1 levels went down when VEGF went up. "The next step[s] [were] figuring out first, whether microRNA-1 levels are important in the disease, and second, why," Takyar said. The team was then able to show, through biological and mathematical models, that the molecule was important. They also showed that asthmatic eosinophilia reactions were reduced by increased levels of microRNA-1 in the blood.

Takyar and his team recreated the conditions first in transgenic mice and in an engineered lentivirus (a retrovirus with a long incubation period), and then created a model of a human vessel in the lab. "We isolated endothelial cells and reversed the change [in microRNA levels], and only changing microRNA-1 levels in these cells [could] decrease the features of asthma," Takyar said.

The next step was to understand where the molecule acts in the cell. The results revealed that microRNA was inhibiting the asthmatic reaction by acting within a known protein complex, the RNA-induced silencing complex. This was especially hard to investigate because microRNA cannot be tagged, since tagging would prevent the molecule from entering the complex. Within the complex is an Argonaute protein, which acts as a "matchmaker" for microRNA and the molecules it inhibits; this protein was used to understand how microRNA was influencing the eosinophilic reaction.


"We captured the Argonaute when microRNA was entering to see what microRNA-1 was acting on," Takyar said. With this strategy, the team identified four genes that control eosinophilia, an important breakthrough.

After this discovery, the researchers extracted cells from humans with CRS and created a model of the human endothelium environment. In this model, they passed eosinophils over the cells to see how many would stick to them at varying levels of microRNA. As expected, they found that at increased levels of microRNA, fewer eosinophils adhered to the cells, revealing the mechanism of action for the inhibition of the symptoms.

To the researchers, the discovery shows promise for clinical treatments. "Right now, we are starting a collaboration with clinical groups and some companies to use [this mechanism] as a complementary treatment for some patients who do not respond to drug treatments," Takyar said. He explained that treatment for CRS and asthma is difficult because patients have different symptoms and mechanisms; therefore, many drugs have no effect on certain individuals. The role of microRNA in the eosinophilic mechanism is an exciting development in the area and could represent a step forward in improving treatments. ■

Korde, A., Ahangari, F., Haslip, M., Zhang, X., Liu, Q., Cohn, L., Gomez, J. L., Chupp, G., Pober, J. S., Gonzalez, A., & Takyar, S. S. (2020). An endothelial microRNA-1-regulated network controls eosinophil … The Journal of Allergy and Clinical Immunology, 145(2), 550–62. https://doi.org/10.1016/j.jaci.2019.10.031

Jeffs, R. (2011). MicroRNA and mRNA visualization in differentiating C1C12 cells. Retrieved 22 March 2020, from https://commons.wikimedia.org/wiki/File:MicroRNA_and_mRNA_visualization_in_differentiating_C1C12_cells.jpg



FOCUS

Nanoscience

TRACKING DRUGS

Searching through Nanoparticle Libraries

In the wake of the COVID-19 pandemic, clinical trials for potential drugs and treatments have never seemed more paramount. Scientists must carefully evaluate a drug's interactions within the body and observe possible side effects. Particularly for drugs that must be administered intravenously, it is critical for researchers to determine the circulation half-life, or the time it takes for a drug's concentration to be halved, to gain insight into how long a drug remains in the body.
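For readers curious how such a number is extracted, here is a minimal sketch with hypothetical readings, assuming simple first-order (exponential) clearance, the standard model behind a single half-life value; the function and numbers are illustrative, not from the study:

```python
import math

def half_life(t1, c1, t2, c2):
    """Estimate circulation half-life from two concentration readings,
    assuming first-order clearance: C(t) = C0 * exp(-k * t),
    which gives t_half = ln(2) / k."""
    k = math.log(c1 / c2) / (t2 - t1)  # decay rate constant from the two points
    return math.log(2) / k

# Hypothetical fluorescence-derived concentrations: the signal halves
# between t = 0 h and t = 2 h, so the estimated half-life is 2 hours.
print(half_life(0.0, 100.0, 2.0, 50.0))  # → 2.0
```

In practice, researchers fit many time points rather than two, but the underlying exponential model is the same.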

One useful technology for visualizing and measuring concentration is fluorescence microscopy. The principles of fluorescence rely on the excitation of fluorophore molecules and their emission of light. With fluorescent dyes or probes to track a target molecule, fluorescence microscopes allow biomedical researchers to perform experiments in vivo, or directly in the organism, providing strong spatiotemporal resolution to visualize physiological systems in normal and diseased states.

This is a primary research focus of the laboratory of Mark

Saltzman, professor of Biomedical Engineering at Yale University.

From generating polymeric nanoparticles that aid

drugs in targeting brain tumors to producing bioadhesive

biodegradable nanoparticles to be used in sunscreen, the

Saltzman research group has aimed to devise safer and more

effective technologies to prevent disease. For the past fifteen

years, the Saltzman laboratory has also generated libraries

of nanoparticles to determine how their chemical composition

may affect the behavior of and interaction with biological

specimens, including impacts to circulation half-life.

“We use nanoparticles to facilitate the delivery of therapeutic

molecules, protecting them from things in the blood like

a protective shell, until they arrive at the target destination.

Depending on the properties of the nanoparticles, you can

deliver a package of molecules all at once or gradually,” said

Laura Bracaglia, a postdoctoral researcher who has worked

on developing these nanoparticle libraries. Evaluating the efficacies

of these nanoparticles relies on an accurate method

of measuring concentration over time, such as via correlating

fluorescence intensity of fluorescently tagged injected agents.

In a recent paper published in PNAS, co-first authors Bracaglia

and postdoctoral colleague Alexandra Piotrowski-Daspit

designed a quantitative microscopy approach to efficiently

measure the circulation half-lives of fluorescently tagged

agents, such as nanoparticles encapsulating fluorescent dye or

fluorescently labeled antibodies.

BY ANNA SUN


Limitations of Traditional Methods

A commonly used protocol for determining

concentration of fluorescently

dyed nanoparticles after administration

involves three steps: collecting at least

twenty microliters of blood from experimental

animals, separating dyed nanoparticles

from the blood samples, and measuring

the dye’s concentration by dissolving

the nanoparticles to create a uniform solution.

The process, however, can be laborious,

expensive, and error-prone.

One of the greatest challenges of the

traditional method is the volume of blood

needed for a plate reader to detect even

trace amounts of fluorescent dye within

the sample. The catch-22 is that removing

too much blood from an experimental

animal can interfere with studying how

injected drugs affect disease outcomes,

since circulating drug molecules can be

removed during blood collection.

Revamped Microscopy

The researchers realized that the plate

reader machine typically used to measure

fluorescent dye concentrations was ineffective.

“You need a uniform amount of

blood in the plate reader, but the measurement

tends to be inaccurate depending

on where in the solution you

are measuring,” Piotrowski-Daspit said.

To address the limitations of using large

blood volumes, the researchers decided

to switch to quantitative microscopy,

which requires only a drop of blood on

a microscope slide. “Depending on the

strength of the microscope, you can see

in the sub-micron level, so you don’t need

that much blood to see everything,” she

said. With their revamped method, only

two microliters of blood, compared to the

twenty microliters needed for the existing

protocol, are needed to accurately measure

circulation half-life.

The concentration of a drug in circulation

decreases exponentially until it approaches

zero, when it has been mostly

eliminated from the body. A drug’s half-life is a useful measurement in understanding

circulation time, and the goal

of this quantitative microscopy method

is to understand how the drug is transported

and reacts within the body. “You

can make design changes to a molecule

or drug via physical or chemical methods

to make it less likely to be degraded

or phagocytosed in order to be circulated

for a longer time in the blood,” Bracaglia

said. “Sometimes, it’s also beneficial

for a drug to have extended circulation

to allow more time to reach a target,” Piotrowski-Daspit

added.

Where to Inject?

In their study, the researchers initially

focused on quantifying rodent drug delivery.

Because there are two standard

ways of intravenously injecting drugs to

rodents—retro-orbital (RO, or behind the

eye) and tail-vein (TV, or in the tail) administration—the

researchers tested both

routes of administration to better understand

possible changes in circulation

half-life. “RO is easier for some people,

so we were thinking if one experimenter

injects RO and another does TV, then

does that matter?” Bracaglia explained.

Whereas the previous protocol might

not have had the resolution to accurately

measure differences in half-lives between

RO and TV routes, the researchers detected

subtle differences in nanoparticle

concentrations measured within the first

thirty minutes of blood collection—a testament

to the powerful resolution of their

method. TV injection had higher measured

concentrations, but these concentrations

equalized after one hour. This

initial variability was not too concerning,

because “we’re sampling blood from the

tail, so it makes sense that the TV concentration

was higher at first than the RO,

which needed more time to pass through

circulation,” Piotrowski-Daspit said. Bracaglia

pointed out that detecting changes

in circulating concentration based on the

route of administration may also be relevant

for humans, since drugs are also administered

using various methods.

Expanding the Data

To determine whether this improved

method of measuring fluorescence concentration

could be applied to molecules

of different sizes, the research team also

successfully tested fluorescent antibodies.

“Whereas nanoparticles are usually

sized between 180-250 nm, antibodies are

smaller at around 10 nm. We wanted to

see if we can detect a wide range of agents

that might be injected into an animal

model,” Piotrowski-Daspit said. Because

their circulation measurements of these

antibodies matched the decay profiles

gathered from literature, the researchers

were confident that their method could

even detect small antibodies in the blood.

The data from the quantitative microscopy

method can also be combined with

further multivariable analyses. Saltzman

emphasized the importance of observing

biodistributions from these experiments—understanding

what kind

of tissues and what types of cells the

nanoparticles are found in over time. “By

coupling with other methods, you end up

with a powerful high-throughput, comprehensive

look at how long these particles

circulated and where they end up,” he

said. Furthermore, because only a small

amount of blood is needed for each sample,

more data can be collected from a

single experiment and animal. “Using

different nanoparticles each with separate

dyes, you can track these nanoparticles

in one animal. Because this can also

introduce differences in half-life and biodistribution

than when injected alone, it’s

an interesting way to see what happens

when you administer more than one drug


at once,” Piotrowski-Daspit explained.

This method could provide researchers

opportunities to better understand combination

therapies in humans as well.

Overcoming Obstacles

The researchers faced a few challenges

on their path to developing this improved

microscopy method. First, a major

concern with nanoparticle research

is the possibility that the fluorescent dye

(which is visualized) and the nanoparticle

itself have unexpectedly separated, so

the dye is no longer indicating where the

nanoparticle is. To address this, the team

ordered a commercially available polymer

that is chemically linked to a fluorescent

dye, and then imaged both the

polymer and a separate encapsulated dye.

“The observation that they colocalized

served as evidence that going forward, if

we look only for the encapsulated dye, we

can be confident that it is also with the

nanoparticle of interest,” Bracaglia said.

Another concern was that measuring fluorescent

agents might not be as accurate

as measuring radiolabeled agents, so the

team carefully compared their experimental

half-lives with examples from literature.

Not only did they confirm similar

half-life values, but their method was

also less complicated and more accessible

for the average lab, which may not have

equipment for measuring radioactivity.

Future Projects

Armed with a more effective method to

measure circulation half-lives of drugs,

the Saltzman research group plans to

rapidly screen through their nanoparticle

libraries. “We are excited to see where

these new nanoparticles go and how long

they stay in the blood, and to learn more

about how changes to physical and chemical

properties can affect drug delivery

success,” Bracaglia said.

An upcoming challenge for these researchers

involves what happens after

nanoparticles are delivered into circulation.

Because the liver functions to detoxify

drugs from the blood, nanoparticles

often accumulate in the liver instead

of the desired target organ. The researchers

hope to discover ways to bypass

the liver, using “decoy” nanoparticles.

“These molecules potentially may be

used to pre-treat and take up residence in

the liver, such that anything that comes

afterwards can remain in circulation longer

and reach other organs,” Piotrowski-Daspit

explained.

The main advantage of this novel protocol

is that the improved quantitative

fluorescent microscopy has drastically

reduced sample blood volume. Such volume requirements previously ruled out experiments involving valuable animals with rare tumors or diseases.

“You normally don’t want to waste

these animals doing a half-life experiment.

If you’re treating the tumor, you

want to save these animals to see if the

treatment worked,” Bracaglia said. Drug

circulation, however, might significantly

differ between non-experimental and

diseased animals. With this new time- and cost-effective, accessible microscopy

method, scientists may soon be able to

screen a wide range of therapeutic agents

and provide more accurate measurements

for preclinical studies, enabling researchers

everywhere to answer the growing

need for innovative drugs. ■

ANNA SUN

ANNA SUN is a senior in Jonathan Edwards College majoring in Molecular, Cellular and Developmental

Biology. She currently serves as Managing Editor for the magazine. Outside of YSM, she

studies riboswitches, volunteers in the hospital, and reads with New Haven youth. She also enjoys

dancing and exploring the food scene in New Haven with her friends.

THE AUTHOR WOULD LIKE TO THANK Laura Bracaglia, Alexandra Piotrowski-Daspit, and Mark

Saltzman for their time and thoughtful discussions about their research.

FURTHER READING

Bracaglia, L. G., Piotrowski-Daspit, A. S., Lin, C., Moscato, Z. M., Wang, W., Tietjen, G. T., & Saltzman, W.

M. (2020). High-throughput quantitative microscopy-based half-life measurements of intravenously

injected agents. PNAS, 117(7), 3502-3508.

Bracaglia, L. G., Piotrowski-Daspit, A. S., & Saltzman, W. M. (Personal interview, March 4, 2020).

FDA. (2018, January 4). Step 3: Clinical research. The Drug Development Process. https://www.fda.gov/

patients/drug-development-process/step-3-clinical-research

Smith, Y. (2018, August 23). What is the half-life of a drug? News Medical Life Sciences. https://www.news-medical.net/health/What-is-the-Half-Life-of-a-Drug.aspx

Saltzman Research Group. (n.d.). Our research. https://saltzmanlab.yale.edu/gallery/our-research

Le, J. (2019, June). Drug administration. Merck Manual Consumer Version. https://www.merckmanuals.com/home/drugs/administration-and-kinetics-of-drugs/drug-administration


Computational Biology

FOCUS

DRIVERS PREDICTING THE FUTURE

IMAGE COURTESY OF WIKIMEDIA COMMONS

Using driver mutations to estimate tumor growth patterns

BY MARIA FERNANDA PACHECO

Ever since that split second in which

your parents’ gametes fused to

generate your life, the cells that make

up who you are have not stopped

dividing, even right now. As they

replicate, it is likely that their DNA will mutate,

incorporating new traits into the genetic code

that writes their fate—what structure they will

adopt, what function they will perform, what

purpose they will serve. While some mutations

can be responsible for traits like ginger hair

or the absence of wisdom teeth, or even at

times go completely unnoticed, other far more

dangerous ones hold the power to corrupt a

cell’s machinery, culminating in grave ripple

effects that can make all systems go haywire.


Driver Mutations

PHOTOGRAPH COURTESY OF WIKIMEDIA COMMONS

Photograph of a driving wheel, symbolizing how some mutations drive tumor progression.

The word “tumor” is laced with terrifying potential. Possibilities of anarchical growth, silent spread, and rapid lethality render these neoplasms’ behavior difficult to predict. In an effort to circumvent this uncertainty, a group of Yale researchers led by Mark Gerstein, Albert L. Williams Professor of Biomedical Informatics and Professor of Molecular Biophysics & Biochemistry, Computer Science, and Statistics & Data Science, published a paper in February reporting a mathematical model they developed. The model examines a specific kind of mutation, called a driver mutation, to estimate a tumor’s growth pattern.

The development of cancer is an evolutionary process, punctuated by mutations. Historically, several theories have been put forward as to how researchers can study such genetic alterations, but, most recently, mutations have increasingly been labeled as either drivers or passengers to categorize them according to their relevance in tumor progression. Leonidas Salichos, a postdoctoral associate and first author of the paper, explained that “we have a lot of mutations in every tumor, sometimes thousands of mutations, and a few of them are what we call drivers, … which we try to detect because those are the ones that actually play a role in tumor progression.” Conversely, mutations identified as nonsignificant in terms of tumor development are dubbed passenger mutations.

According to Gerstein, driver mutations can be defined as the “few mutations that accumulate in the cell and drive its growth forward.” In the paper, the authors discuss different means through which these mutations can trigger the formation of tumors, including hindering the ability of tumor-suppressor genes to impede tumor growth and enhancing the expression of oncogenes, which are genes that can cause cancer.

How the Model Works

When tumors are biopsied, a sample is often extracted and sequenced to reveal its genetic composition. According to Salichos, the number of times a specific position in the genome is sequenced is very important: the deeper the sequencing, the more accurately the frequency of a mutation within the tumor cell population can be measured. At the end of the process, you have acquired a run-down that details all of the mutations detected as well as their respective frequencies, which paints a clear picture of their prevalence within the tumor.

“Based on the frequency, you can already make an assessment of whether that mutation happened early or late in the tumor, because, if it happened early in the tumor progression, we are expecting it to have a higher frequency at the end,” Salichos said. Therefore, ordering mutations from most to least frequent provides insight into the order in which they occurred. This information contextualizes which mutations might have stimulated tumor growth and which occurred as collateral damage, helping frame their relevance with respect to tumor progression.

Salichos explained that, based on this idea, he developed a mathematical model that uses the frequencies of the mutations that occurred just before the driver mutation to detect the driver’s presence and estimate tumor growth at the precise moment when it first emerged. This examination allowed the group to gauge the impact of this phenomenon, since the introduction of a driver mutation


into a sample often creates a detectable

perturbation in the variant allele frequency

distribution. “Once you introduce a

driver into a population that grows, now

the population starts growing faster, and

that has an impact on the frequency of

the mutations that happen before that,”

Salichos said.
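
The intuition can be sketched numerically. In the toy model below, which uses made-up parameters and is not the authors’ actual method, a neutral mutation arising when the tumor contains N cells is expected to reach a final frequency of roughly 1/N, and a driver raises the growth rate partway through, changing how those frequencies fall off with time.

```python
import math

# Toy illustration with hypothetical parameters (not the paper's model):
# a tumor grows exponentially at rate r1 until a driver mutation appears
# at time t_d, after which it grows at a faster rate r2. A neutral
# mutation arising at time t ends up in roughly 1 / N(t) of final cells,
# so the driver leaves a detectable kink in the frequency spectrum.

def population(t, t_d=5.0, r1=0.3, r2=0.9):
    """Tumor size at time t under pre- and post-driver growth rates."""
    if t <= t_d:
        return math.exp(r1 * t)
    return math.exp(r1 * t_d) * math.exp(r2 * (t - t_d))

for t in [1, 3, 5, 7, 9]:
    n = population(t)
    print(f"mutation arising at t={t}: expected final frequency ~ {1 / n:.5f}")
```

In this sketch, frequencies of mutations arising after t_d drop off much more steeply than those arising before it, which is the sort of perturbation the method looks for.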

Driving Towards More Accurate Cancer Prognosis

“Traditionally, the way people find these

drivers is they look at cohorts of cancer

patients at the same time,” Gerstein said.

Salichos also highlighted that over a thousand

samples are often needed in traditional

methods, since large numbers are required to

ground observations that something deviates

from the normal. At least computationally,

this is how scientists normally validate a

suspicion that a specific mutation is important

for the development of a tumor.

However, the need to examine a whole

cohort can serve as a limitation in the study

of cancer genomics. If several samples

are required every time physicians want

to understand the role of a driver in a

tumor’s progression within a particular

patient, individualized assessments of how

specific growths will develop become more

complicated to attain. In that regard, this

is where their model adds something new.

“This method doesn’t require a cohort, but

only one tumor to be very deeply sequenced,”

Gerstein said. The approach incorporates

ultra-deep sequencing, a method that

entails the sequencing of the same location

in the genome several times to identify rare

variations, into their analysis. “The novelty

of this method was, instead of looking

into many different samples, we actually

harnessed the frequency of the mutations

based on growth models and analyzed both

the mutations and their frequency in the

population to try to make an assessment of

which of them mattered and which did not,

all within that individual sample,” Salichos

said.

This model could enable scientists to

account for how cancer heterogeneity

results in no two tumors ever being

completely alike. While reliance upon

averages is often important when looking

at growths that behave differently

depending on their genetic make-up,

as well as the context in which they are

inserted, every tumor—even ones of the

same kind—will behave differently. “With

this kind of model, you can look into

an individual’s tumor in a more direct

way… you don’t have to think about a

cohort or a database very much,” Gerstein

said. Considering how this framework’s

applicability does not require more than a

single tumor, it could lay the foundation

for more specialized evaluations that take

only the characteristics of the studied

tumor into account, making more specific

assessments possible.

Testing the Model’s Efficacy

In order to test the model’s effectiveness,

simulations were run to see if it could,

in fact, predict the presence, time of

occurrence, and effect of a driver mutation.

In addition to testing the algorithmic

function upon which the model relied

by applying it under different growth

models, such as exponential growth and

logistic growth, the group also sought to

demonstrate the framework’s efficacy on

real samples. To that end, the model was

applied to 993 tumors obtained from the

Pan-Cancer Analysis of Whole Genomes

Consortium—an online database that

provides information obtained through

whole genome sequencing and integrative

analysis data of over 2,600 tumors across

thirty-eight diverse types of tumor.
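
For intuition, the two growth regimes mentioned above can be compared side by side. The parameters here are invented purely for illustration: exponential growth is unbounded, while logistic growth saturates at a carrying capacity.

```python
import math

# Illustrative comparison of the two growth models named above, with
# made-up parameters: exponential growth N(t) = N0 * e^(r t) versus
# logistic growth, which levels off at a carrying capacity K.

def exponential(t, n0=1.0, r=0.5):
    return n0 * math.exp(r * t)

def logistic(t, n0=1.0, r=0.5, k=100.0):
    return k / (1 + ((k - n0) / n0) * math.exp(-r * t))

for t in range(0, 21, 5):
    print(f"t={t:2d}  exponential={exponential(t):10.1f}  logistic={logistic(t):7.2f}")
```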

After observing that the identified

drivers were correlated with periods of

positive growth in the samples examined,

the group sought to further consolidate

their framework by applying it to a sample

of an Acute Myeloid Leukemia (AML)

tumor. According to Gerstein, this tumor

was chosen due to its history of having

been deeply sequenced in the past. For

AML, the growth patterns they predicted

showed significant similarities with those

exhibited by the tumor.

The promising evidence surrounding the

model’s effectiveness provides reasons to

be optimistic about its future applications.

This novel way to look into tumors could

make a big difference in the future of cancer

treatments. Instead of just relying on broad

data, this could allow doctors to tailor their

evaluations of a patient’s prognosis to what

their specific tumor sample shows. In this

way, this kind of personalized assessment

could herald a new era in cancer genomics. ■

MARIA FERNANDA PACHECO

MARIA FERNANDA PACHECO writes for YSM, the Yale Global Health Review, and the Yale Daily News.

IMAGE COURTESY OF PIXABAY

Three-dimensional illustration of DNA.


FOCUS

Ecology

THE ROLE OF DEATH IN PLANT LIFE

How does soil organic matter help crop growth?

BY CINDY KUANG

Death, SOM, and Soil

Death often stays in the soil. Over time,

organisms and residues in varying

states of decomposition form a vital

component of the soil: soil organic matter

(SOM). SOM has always been thought of as

an indicator of soil fertility, contributing to

healthier soil and better crop growth. Thus,

building SOM, or raising its levels through the

addition of compost or manure, is assumed to

be a cost-effective way of reducing reliance

on external inputs such as fertilization and

irrigation. But how well are the effects of

SOM actually understood? In various studies,

higher SOM has been shown to correlate with

both higher and lower productivity, so the effects of added SOM on soil fertility have remained inconsistent and unclear.

The question is further complicated by

the possibility that this causative pathway is

bidirectional: Does SOM lead to increased

crop productivity, or do increased plant

inputs lead to higher SOM levels? In order

to figure out this relationship between SOM,

agricultural inputs, and productivity, we

need to be able to isolate and investigate

SOM’s effect on plant growth. Emily Oldfield,

a postdoctoral fellow at the Yale School of

Forestry and Environmental Studies, works

in the Bradford lab to study SOM’s effects. “A

lot of the policies that are being put forth by

organizations like the USDA, the Food and

Agriculture organization rest on the premise

of the more organic matter, the better, but

there’s really no hard quantification of how

much more and how much better,” Oldfield

said. “The goal of my research was to try to

put some numbers behind it.” She described

this greenhouse study as an effort to use

a controlled environment to establish a

causative pathway between SOM and crop

productivity.

What is soil?

Soil itself is a complex mixture of elements,

consisting of around forty-five percent

minerals (including sand, silt, and clay) and fifty

percent air and water. In particular, plants

require nitrogen more than any other nutrient,

but they can only take up mineral forms of it,

including nitrate and ammonia, which only

make up two percent of the nitrogen in soil.

The other ninety-eight percent of nitrogen is

organic and inaccessible to plants, meaning

many farmers rely on mineral N-fertilizer to

facilitate crop growth.

The remaining five percent of soil

composition is soil organic matter—anything

that was once living. Though SOM is a very

small percentage by volume, its influence is

disproportionately large. SOM dictates the

structure of the soil, increasing aeration and

water-holding capacity, and acts as a habitat

for other soil organisms. SOM also powers

the cycling, retention and release of various

nutrients essential to productivity.

The Experimental Setup

Oldfield was determined to quantify the

effect that SOM, fertilization, and irrigation

have on crop productivity, both used

separately and in tandem. She designed an

experiment with four target levels of SOM

(1%, 2.5%, 5.5%, 8.5%) crossed with two

different fertilization treatments (none

versus 100 kg N/ha as urea) and further

crossed with two irrigation treatments

(optimum versus half of optimum). To

verify reliability of results, each treatment

was replicated 10 times—for a grand total of

160 experimental pots.
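
The full-factorial design can be tallied directly; this quick sketch uses the treatment levels reported above.

```python
from itertools import product

# The full-factorial design described above: four target SOM levels,
# crossed with two fertilization and two irrigation treatments,
# with ten replicate pots per treatment combination.
som_levels = ["1%", "2.5%", "5.5%", "8.5%"]
fertilization = ["none", "100 kg N/ha as urea"]
irrigation = ["optimum", "half of optimum"]
replicates = 10

treatments = list(product(som_levels, fertilization, irrigation))
print(len(treatments))               # 16 treatment combinations
print(len(treatments) * replicates)  # 160 experimental pots
```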

A dilution approach of organic-rich A

horizon soil (obtained from the Yale Farm in

New Haven, Connecticut) was used to create

these varying SOM levels. By mixing the soil

with an external mineral component (sand

and clay) in different ratios, Oldfield was able

to create a wide gradient of organic matter

concentrations without having to artificially

manipulate SOM (which can lead to other

experimental issues).
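
The arithmetic behind such a dilution gradient is simple. In the sketch below, the 8.5 percent stock SOM value and the assumption that SOM mixes linearly by mass are illustrative choices, not values taken from the study’s protocol.

```python
# Hypothetical sketch of a dilution approach: assuming SOM mixes
# linearly by mass and the mineral diluent contains ~0% SOM, the mass
# fraction of organic-rich stock soil needed to hit a target SOM level
# is simply target / stock. The 8.5% stock value here is an assumption.

def soil_fraction(target_som, stock_som):
    return target_som / stock_som

stock_som = 8.5  # percent SOM of the undiluted stock soil (assumed)
for target in (1.0, 2.5, 5.5, 8.5):
    print(f"target {target}% SOM -> {soil_fraction(target, stock_som):.0%} stock soil")
```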

This greenhouse experiment was

conducted from May to July, and automatic

ventilation ensured that the pots of

spring wheat (Triticum aestivum, L.), the

experimental crop, never exceeded a daily

temperature of 30 degrees Celsius. A drip

irrigation system was calibrated to emit 0.25

gallons to each pot per hour, though this was

later modified to create different treatments

for various pots. “Optimum irrigation” was

determined to be around 127.2 mL of water

each day and “suboptimum irrigation” was

63.6 mL. At the end of the growing period,

all plants were cut at soil level at the same

time, dried at 65 degrees Celsius and then

weighed in aboveground biomass. Soils were

then passed through a sieve and measured

in terms of SOM content, water-retaining

capacity, pH, microbial biomass, and rates of

net mineralization and nitrification.


Results

To analyze the effect of SOM on growth,

the researchers used a statistical method

called regression to quantify the impact of

each measured variable on plant growth. The

regression models showed that aboveground

plant growth increased as SOM levels

increased until a threshold concentration

of around five percent, after which wheat

biomass began to decline. For soils with

optimum irrigation, this decline started

occurring at around six percent SOM.
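
A threshold of this kind falls out naturally from a quadratic regression. The coefficients below are invented for illustration, not the study’s fitted values: with biomass modeled as a + b·SOM + c·SOM², the curve peaks at SOM = -b / (2c).

```python
# Illustration with made-up coefficients (not the study's fit): a quadratic
# response biomass = a + b*SOM + c*SOM^2 with c < 0 has its maximum at
# SOM = -b / (2 * c), the kind of threshold the regressions identified.

a, b, c = 1.0, 2.0, -0.2  # hypothetical regression coefficients

peak_som = -b / (2 * c)
biomass_at_peak = a + b * peak_som + c * peak_som ** 2
print(peak_som)         # 5.0 -- peak near five percent SOM
print(biomass_at_peak)  # 6.0
```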

Across all SOM concentrations, the

biggest difference in aboveground biomass

was observed between the two experimental

extremes: the pots with optimum fertilizer

and irrigation versus the pots with no

fertilizer and half irrigation. However, this

difference was largest at the lowest one

percent SOM concentration (pots with

optimum treatment produced 3.45 times

more aboveground biomass) and became less

dramatic when SOM levels were at or greater

than five percent (optimum pots produced

1.6 times more biomass). This supports the

hypothesis that SOM contribution can, in

some cases, compensate for plants that are

not receiving any supplemental input and

substitute in for mineral N fertilizer. But this

raises more questions of cost and reward—

will productivity of mineral fertilized soils

always outpace that of soils sustained by

organic matter alone? And what about

the reverse hypothesis: can added mineral

nitrogen fertilizer easily compensate for

lower SOM levels?

Nitrification

Though SOM levels did not seem to

exhibit a strong correlative relationship

with net rates of nitrification, they did have

an impact on net rates of N mineralization,

the process by which organic nitrogen is

converted to plant accessible inorganic

forms. As SOM levels increased, rates of

N mineralization increased. This effect

was greater in fertilized soils compared

to unfertilized soils. However, after SOM

concentrations passed a specific threshold

(around seven percent), pots with optimum

treatment began experiencing decreases in

net rates of nitrification: the plants had less

nitrogen accessible to them at eight percent

SOM as opposed to five percent SOM.

Oldfield hypothesizes that this eventual

decrease in nitrification rate may be related

to increased microbial biomass that is

correlated with higher SOM concentrations.

These microbes themselves need to draw

upon specific nutrients in the soil, including

nitrogen, phosphorous and sulfur, which

may lead to a competitive environment

for nutrients and oxygen in the soil. In

such an environment, less resources

are available for plant use, which could

explain why productivity began to decline

instead of leveling off at the highest SOM

concentrations. “However, it’s very hard

to get a holistic picture of the forms of

nitrogen. A follow up study would be almost

the exact same experimental setup, just

with different levels of nitrogen fertilizer,”

Oldfield said. This would help determine if

these nutritional elements become limiting

at high levels of SOM.

Final Conclusions

Returning to the original question,

can soil organic matter substitute for

agricultural inputs such as insufficient

fertilization and irrigation? These results,

obtained by the systematic variation of

variables, demonstrate an optimistic

answer: building up SOM levels in soil will

have beneficial impacts on productivity.

Though it may not be a perfect replacement

for N fertilizer, SOM can still help cut back

on costly fertilizer inputs without risking

a lowered yield. “We know through other

research that’s being done right now that

agricultural soils tend to have very low

organic matter concentrations as a result of

tillage and other conventional practices...

You rarely see farm soil that is nine percent

organic matter,” said Oldfield when asked

whether the SOM threshold of five percent

would pose a problem.

Some scientists and agriculturists

continue to argue that though productivity

may increase with higher SOM

concentrations, these benefits will never

outpace or outweigh those brought about

by additional mineral fertilizer. However,

this perspective fails to take into account

the cost and availability of fertilizer. “There

are potential outcomes that don’t directly

translate to yield but are enhancements

in other environmental outcomes that we

do care about. This could be mitigating

agricultural runoff to improve water

quality, improving biological activity of

microbial communities, and enhancing

carbon sequestration,” Oldfield said.

What’s next?

Given that many groups such as the

USDA and policy makers rely on the

general notion that “more is better” when

it pertains to SOM levels in soil, Oldfield

is determined to continue delving into the

nuances and intricacies of organic matter

in soil. She briefly explains how increasing

organic matter could pose drawbacks:

increased SOM concentrations are related

to increases in nitrous oxide emissions, a

very potent greenhouse gas. “I’m interested

in linking [this research] to other outcomes

besides yield,” she says. Her ultimate

research goal is to run this experiment on

a much larger scale and get the “full farm

look,” so she can not only measure crop

growth, but also bigger profitability issues

such as balancing yield against costs and

observing ecosystem outcomes. ■

ART BY ANASTHASIA SHILOV

CINDY KUANG

CINDY KUANG is a first-year prospective Neuroscience major in Timothy Dwight College. In

addition to writing for YSM, she also participates in Danceworks and the Chinese American

Students Association.

September 2020 Yale Scientific Magazine 17


FOCUS

Physics

THE PROMISE OF TWO-COPPER-PAIR TUNNELING

BY SHOUMIK CHOWDHURY

Every day, hundreds of Yale students

take classes in Davies Auditorium and

work in the Center for Engineering,

Innovation, and Design. Likely only a handful
will know that just a few stories above them—

on the 4th floor of Becton Center—reside

some of the world’s most powerful quantum

computers. These devices are housed in

the Yale Quantronics Laboratory—Qulab

for short—and are operated by cooling

superconducting circuits in microwave

cavities down to millikelvin temperatures, at

which point their behavior is aptly described

by the laws of quantum mechanics.

Using quantum mechanical systems (such as atoms) for computation is not a new concept; it was proposed by physicist Richard Feynman in the early 1980s. Much of

the progress in experimentally implementing

these devices, however, has come in the last

twenty years, with superconducting circuits

(behaving as artificial atoms) emerging as a

leading platform for quantum information

processing. Unfortunately, quantum bits,

or qubits, built from these circuits are still

highly sensitive to various types of noise

from the environment. This has driven

widespread effort in the field to build better

qubits. Recently, a team of researchers at

Qulab—led by principal investigator Michel

Devoret and graduate student Clarke

Smith—designed a new type of protected

superconducting qubit that is robust at the

hardware-level against several different

noise channels.

Notions of Quantum Computing

Quantum computers are based on a

fundamentally different set of rules than

so-called classical computers—a broad

label characterizing most devices in use

today. Classical data are stored in bits,

and a single binary digit can take on two

logical values: 0 or 1. In practice, this could

be realized by the passage of current, and

the lack thereof, through a wire, or by the

magnetization state of a small region of a

hard drive. At the lowest level, under many

layers of abstraction, all classical algorithms

and operations reduce to manipulating

some pattern of bit strings from an input

state to an output state. The key takeaway

here is that bits take on definite values.

In contrast, quantum computers encode

information in the quantum states of

a system—for instance, in the states

representing the lowest two energy levels

of an atom. These two states, which we

can abstractly label as |0> and |1>, form

what is known as qubit subspace, and by

sending appropriate pulses of light to the

atom, one can perform logical operations

on the qubit. The key difference from the

classical model, however, is that we can also

form admixtures of the two states, called

superpositions, of the form α|0> + β|1>. The outcomes of measuring such superposition states are determined by rules of probability, giving either |0> or |1> with probabilities |α|² and |β|² respectively. While this may

seem counterintuitive, it turns out that

several classes of problems are very well-suited
to a quantum computer that does

not have definite 0 or 1 bits. Some notable

examples include cryptography and prime

number factorization, optimization and

machine learning, and simulating quantum

mechanical systems (such as molecules) for

applications in fundamental physics and

chemistry. However, the aforementioned

sensitivity of quantum information means

that quantum computers are also more

susceptible to noise and other errors that

arise from coupling to the environment. Any

spurious interaction can lead to unwanted

changes to the desired quantum state, and

thus introduces errors into a calculation.
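The probabilistic measurement rule above can be illustrated with a short numerical sketch (a generic simulation with arbitrary amplitudes, not code from the Yale group):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary normalized single-qubit state alpha|0> + beta|1>.
alpha = 1 / np.sqrt(3)
beta = np.sqrt(2 / 3) * 1j
state = np.array([alpha, beta])
assert np.isclose(np.linalg.norm(state), 1.0)

# Born rule: a measurement returns 0 or 1 with probabilities |alpha|^2, |beta|^2.
probs = np.abs(state) ** 2
samples = rng.choice([0, 1], size=100_000, p=probs)
print(probs)           # [1/3, 2/3]
print(samples.mean())  # close to |beta|^2 = 2/3
```

Repeated measurements reproduce the |α|² and |β|² statistics, even though any single measurement yields a definite 0 or 1.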

The fragility of quantum information has

led to what Michel Devoret describes as a

two-pronged effort in the field. “The first

approach is to discover a better method

for quantum error correction … while the

second [approach] is to design physical

qubits with better lifetimes and faster

gate operations,” Devoret said. Research

into quantum error correction (QEC)

involves finding ways to encode a logical

bit of quantum information across many

physical qubits—the benefit is then that

the information becomes more robust

to noise, being distributed non-locally

across the system. However, these kinds of

QEC protocols are often very theoretical

in nature, and the authors of the present

study chose to focus on the more tractable



second approach: engineering qubits

to have properties that make them less

susceptible to noise.

Building Better Qubits

Ultimately, the researchers would develop

a novel type of superconducting qubit better

protected against noise, which Devoret

and coworkers recently reported. The

new design is based on a proposed circuit

element that allows only pairs of Cooper pairs
of electrons to tunnel across the circuit.

“It is an elaboration on the transmon and

fluxonium qubits that we had previously

worked on,” noted Devoret.

The idea of building qubits from

superconducting circuits was first proposed

in 1997; progress in the field followed

rapidly. In such quantum electromagnetic

circuits, charge carriers are pairs of bound

electrons—known as Cooper pairs—which

may quantum mechanically tunnel through

a junction. The quantum mechanical states

of the circuit can be labelled by the number

N of Cooper pairs that tunnel. Although

these circuits are macroscopic objects—

made up of many millions of electrons and

atoms—the number of effective degrees

of freedom is quite small. This gives

superconducting qubits a relatively simple

energy spectrum and is why these systems

are often referred to as artificial atoms.

The transmon qubit is one such type of

superconducting qubit, and it is the qubit

of choice for many of the commercial

players in the field of quantum information,

including IBM, Rigetti, and Google. It

consists of a nonlinear inductance—the Josephson junction, conventionally drawn as a crossed box in circuit diagrams—in parallel with a capacitor, where

the charging energy of the circuit is much

smaller than the so-called tunnelling energy.

The transmon has an oscillating potential energy U = E_J cos(φ), where φ is the superconducting phase in the circuit, and E_J is the tunneling energy for the Cooper pairs

across the junction. “The Josephson junction

is very precious [in superconducting

quantum computing] because it is the

only non-dissipative [lossless] nonlinear

element we have,” explained Xu Xiao, one

of the researchers on the project. “The

cosine potential is therefore the only kind of

nonlinearity we usually have access to,” Xiao

continued. This nonlinearity—stemming

from the cosine potential of the Josephson

junction—is necessary to ensure that the

(frequency) level-spacing is unequal; this

makes it possible to address only the lowest

two-levels as a qubit, without exciting the

higher energy level states.
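The role of the cosine nonlinearity can be seen by diagonalizing a textbook transmon-style Hamiltonian numerically. This is a generic sketch with illustrative parameter values, not the Qulab model:

```python
import numpy as np

# Sketch (standard textbook transmon, not the Qulab circuit): diagonalize
# H = 4*Ec*(n - ng)^2 - Ej*cos(phi) in the Cooper-pair-number basis, where
# the cos(phi) term couples neighboring charge states n and n +/- 1.
Ec, Ej, ng = 0.2, 20.0, 0.0   # illustrative energies (GHz); Ej/Ec >> 1
N = 15                        # keep charge states n = -N..N
n = np.arange(-N, N + 1)
dim = 2 * N + 1
H = np.diag(4 * Ec * (n - ng) ** 2) \
    - (Ej / 2) * (np.eye(dim, k=1) + np.eye(dim, k=-1))
E = np.linalg.eigvalsh(H)

# The cosine nonlinearity makes the level spacings unequal, so the lowest
# two levels can be addressed as a qubit without exciting higher states.
print(E[1] - E[0])  # 0 -> 1 gap, roughly sqrt(8*Ej*Ec) - Ec
print(E[2] - E[1])  # slightly smaller: negative anharmonicity of order Ec
```

The second gap comes out smaller than the first by roughly E_C, which is exactly the unequal spacing that lets the lowest two levels serve as a qubit.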

The proposed “two-Cooper-pair” qubit

includes a novel circuit element which is

itself composed of two Josephson junctions

in a loop. The current through this loop is

controlled by an external magnetic field

or flux, which, when tuned carefully, gives

rise to an effective potential energy term

of the form U = E_J cos(2φ), i.e., it now has

two energy wells, rather than one. Thus, by

connecting several Josephson junctions, it

is possible to engineer an effective potential

that would not otherwise have been realizable

using a single transmon qubit alone.

The “cos(2φ)” potential reflects the feature

that only pairs of Cooper-pair electrons can

tunnel across the circuit element at a time.
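That the cos(2φ) potential has two wells per period, where the transmon's cosine has one, is easy to check numerically (an illustration, not the paper's calculation):

```python
import numpy as np

# Illustration (not the paper's calculation): a -cos(phi) potential has one
# minimum per 2*pi period, while -cos(2*phi) has two, i.e. two energy wells.
phi = np.linspace(0, 2 * np.pi, 10_000, endpoint=False)

def count_wells(U):
    """Count strict local minima of U on a periodic grid."""
    left, right = np.roll(U, 1), np.roll(U, -1)
    return int(np.sum((U < left) & (U < right)))

print(count_wells(-np.cos(phi)))      # 1 well  (transmon-like cos potential)
print(count_wells(-np.cos(2 * phi)))  # 2 wells (two-Cooper-pair element)
```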

It follows that the number N of Cooper pairs
that have tunneled must have constant

parity (i.e., be even or odd), leading to two

different ground states (of equal energy but

opposite parity). These degenerate ground

states—of equal energy—can be used to

store quantum information in a way that

is resistant to noise. As discussed
earlier, this system protects its quantum

information by distributing it across more

than one state. “In experiments, there are

various noise channels, which couple to

the system via some operator. Because of

the special parity of this new potential,

transitions via many such noise channels

between the logical zero and one states are

prohibited,” Xiao said. The researchers at

Qulab tested this new design in simulations

to show that the characteristic lifetimes of

the “two-Cooper-pair” qubit are competitive

with other state-of-the-art implementations,

at around one millisecond. “This [result]

shows how much we can gain … it seems

like if we build more complex circuits, we

could go a long way towards building better

qubits,” Devoret said.

Outlook and Implications

“I think the field as a whole is realizing

how to exploit the specific features of a

quantum system to store information,” said

Xiao, when reflecting on the significance

of their result. “This work is theoretically

quite a successful tactic in understanding

how we go from a circuit design to a

desired Hamiltonian [a mathematical

description of a quantum system], and

then to understanding why it is robust,”

he continued. “Even though this may not

be the ultimate qubit used in a generic

quantum computer, it still educates us a lot

about what types of resources we have.”

Engineering qubits with better coherence

properties can be thought of as a passive form

of error correction, in contrast to the explicit

active quantum error correction protocols

described earlier. “Actually, both methods

are needed,” Devoret said, referring to two

other articles from Qulab to be published

soon. These both try to implement an active

QEC, via two other types of qubit: the Kerr-

Cat qubit (that encodes information in

the phase space of a harmonic oscillator)

and the bridge-state qubit. “This research

takes place on various fronts; you get here

an example of a concerted effort [in our

group] to improve quantum information

science,” Devoret continued. Indeed, as the

field continues to progress, each of these

developments will be crucial steps towards

ultimately realizing a scalable and fault-tolerant
quantum computing architecture—

an idea which, unlike in decades prior, now

seems within reach. ■

ART BY ELLIE GABRIEL

ABOUT THE AUTHOR

Physics

SHOUMIK CHOWDHURY

SHOUMIK CHOWDHURY is a junior in Saybrook College studying Mathematics

and Physics. In addition to writing for YSM, he works on research at the Yale

Quantum Institute and Yale Quantronics Lab and is also co-president of the

Society of Physics Students at Yale.

THE AUTHOR WOULD LIKE TO THANK Professor Michel Devoret and Xu Xiao for

their time and enthusiasm for talking about their research.

FURTHER READING

Smith, W.C., Kou, A., Xiao, X., Vool, U., & Devoret, M.H. (2020). Superconducting circuit

protected by two-Cooper-pair tunneling. npj Quantum Inf 6(8). https://doi.org/10.1038/

s41534-019-0231-2



FEATURE

Research Culture

SHOW AND TELL

SELF-PROMOTION INFLUENCES THE REACH OF SCIENTIFIC DISCOVERIES

BY EVA SYTH

ILLUSTRATION COURTESY OF ANMEI LITTLE

Gender gaps in society continue to persist in everything

from wages to doctoral appointments. Indeed, recent

research from Harvard Medical School and Yale School

of Management suggests a new addition to the chasm: men and

women may differ in how positively they present their research.

University of Mannheim assistant professor and Yale School

of Management (SOM) research fellow Marc J. Lerchenmueller,

Yale SOM professor Olav Sorenson, and Harvard Medical School

professor Anupam B. Jena explored this topic more in-depth by

looking at a number of clinical research and life science articles

published between 2002 and 2017. They then used an algorithm

called “Genderize” to determine the genders of the articles’ first

and last authors, positions which often hold special significance in

life science papers. Lerchenmueller explained that the Genderize

algorithm, which assigns gender to first names based on a database,

was chosen for its accuracy in gender assignment against a US

government-data control sample. The researchers then analyzed

the frequency of 25 words often used in life science articles, which

prior research identified as “distinctly positive,” such as “novel” or

“remarkable”, in the selected articles. The data showed that papers

with both female first and last authors presented their research

positively 12% less than articles with at least one male first or last

author. Additionally, the study found that positive framing was linked with more downstream citations (instances in which a research article is cited by a later article): articles using positive terms received 9.4% more citations.
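The counting step of such an analysis is simple to sketch. The snippet below is a hypothetical miniature: the five terms are examples of the kind of "distinctly positive" vocabulary tracked, not the study's actual 25-word list:

```python
import re
from collections import Counter

# Hypothetical miniature of the study's word count: tally how often a small
# set of "distinctly positive" terms appears in a piece of text. The real
# study tracked 25 such words; this five-word list is illustrative only.
POSITIVE = {"novel", "unique", "remarkable", "unprecedented", "promising"}

def positive_term_counts(text: str) -> Counter:
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(t for t in tokens if t in POSITIVE)

abstract = "We report a novel and unprecedented assay with novel applications."
print(positive_term_counts(abstract))  # Counter({'novel': 2, 'unprecedented': 1})
```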

This raises the question: what are the implications of this

research on the greater scientific community? According to

Lerchenmueller, there are two main ones. First, if we observe this

gender difference, there could be multiple reasons. For example,

this study examined published products, not the work originally

submitted to the journals. “Women [may] originally submit

with positive language, but it gets edited out,” Lerchenmueller

explains. Another possibility he proposes is that women may

innately self-screen their writing and not include these terms.

Erin Hengel, a lecturer at the University of Liverpool, explored the question of whether the peer-review editorial process treats women differently in the economics field. Hengel found

that in economics, female researchers write 7% more “clearly”

(referring to simple sentence structure and overall readability)

due to higher writing standards for women during peer review.

Lerchenmueller hopes similar work gets done in the life sciences,

pointing out that nowadays, people more often put their papers

up online before they are published in a journal. According to

Lerchenmueller, this could be a good place for mining papers

before the editorial process begins.

The second implication applies more broadly to papers authored

by all genders. According to Lerchenmueller, in science today

versus past years and decades, “We generally observe an increase

in [positive] adjectives to describe research regardless of gender.”

In the most influential journals, the use of positive adjectives

was up by over 80% comparing 2017 to 2002. This represents

a different kind of potential danger. If more and more authors

are promoting their research findings as “novel” and “unique”

without actually supporting these claims in the body of the article,

a new issue develops that calls the scientific community itself into

question. “Science is depicted as a voice of reason in the world

of fake news,” Lerchenmuller explains. If science is beginning to

tread a similar path with these exaggerated claims, it may lose

some of its credibility as an objective source. This increase in

positive research promotion could be happening for a number

of reasons. In science today, there is an exploding number of
papers being published, and authors are thus motivated to draw

attention to their work in order to have it stand out in the sea

of new research. It is also reasonable to imagine that scientific

journals don’t discourage this type of promotion. After all, editors

need to include high-profile articles to be high-profile journals.

In order to combat this trend, the scientific community may need

to make a concerted effort towards encouraging reasonable claims

surrounding research findings. In the meantime, the gender gap

with regard to scientific publications continues to persist. ■

Hengel, E. (2017). Publishing while Female. Are women held to higher

standards? Evidence from peer review. Cambridge Working Papers in

Economics CWPE1753. https://doi.org/10.17863/CAM.17547

Lerchenmueller, M. J., Sorenson, O., & Jena, A. P. (2019). Gender differences

in how scientists present the importance of their research: observational

study. BMJ, 2019(367). https://doi.org/10.1136/bmj.l6573



Agricultural Science

FEATURE

DOGS ON DUTY

CANINE DETECTION OF PLANT PATHOGENS BY MAKAYLA CONLEY

As technological advancement allows the world to become

increasingly connected through trade and travel, exotic

pathogens spread more easily across the globe. These

pathogens are not limited to human disease but include plant

pathogens as well. According to Tim Gottwald, the lead researcher

of the Pathology Department at the US Department of Agriculture,

exotic pathogens are especially dangerous to plant populations.

Gottwald explained that most plant species and their pathogens

“develop together evolutionarily, whereas citrus developed in

the absence of its most devastating exotic pathogen, the bacterium
Candidatus Liberibacter asiaticus (CLas).” Rather, CLas was

introduced to citrus about one hundred years ago by an insect vector

most likely originating in the southeast of Asia near India. Because of

this, citrus lacks a natural resistance to CLas.

In recent years, Gottwald’s lab has studied diseases that plague the

citrus industry in the United States, specifically the CLas bacteria.

Gottwald explained that insects act as “little hypodermic needles”

and spread the CLas bacterium from tree to tree. Infected insects

with CLas on their proboscis and in their gut deposit some bacteria

into plant cells when they feed on a tree’s phloem. In order to stop a

widespread epidemic, it is important to catch a tree in the early stages

of infection before the bacteria can spread to the rest of an orchard.

A recent paper published by Gottwald’s team reported that dogs could

be trained to smell a CLas infection with near-perfect identification of

infected trees when surveying orchards. In the first year of the study,

the team trained ten dogs to smell citrus trees and sit next to trees

they identified as infected with CLas. To test the dogs’ accuracy, one

hundred tree test grids were set up with infected trees placed randomly

throughout. The dogs were taken through the ten grids in the same

manner they would survey a commercial orchard. Each dog had nearly

a perfect hit rate and identified infected trees with over ninety-nine

percent accuracy over the one thousand trees tested by each dog.
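As a back-of-the-envelope illustration of how such a hit rate is tallied (the counts below are invented, not the study's data):

```python
# Sketch of how a per-dog hit rate is tallied (the counts below are invented
# for illustration and are not the study's data).
def accuracy(results):
    """results: list of (truly_infected, flagged_by_dog) booleans per tree."""
    hits = sum(truth == flagged for truth, flagged in results)
    return hits / len(results)

# e.g. 1,000 trees: the dog flags 9 of 10 infected trees plus 1 healthy tree.
trials = ([(True, True)] * 9 + [(True, False)]
          + [(False, True)] + [(False, False)] * 989)
print(f"{accuracy(trials):.1%}")  # 99.8%
```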

IMAGE COURTESY OF PIXABAY

This research was prompted by a great need for new methods of early detection of plant pathogens. Gottwald’s research group began to study the use of canine olfaction as a more effective and accurate early detection method back

in 1998. At that time, citrus canker was an exotic

plant disease that was causing an epidemic in

fruit trees. Following a suggestion by one

of his colleagues, Gottwald’s team

explored the use of dogs as a

viable

method for early detection of the disease, but their research came

to an abrupt halt in 2001. After 9/11, canine detection research was

diverted away from the agricultural business and instead focused

on detection of explosives. It wasn’t until 2005 that funding became

available, and the Pathology Department could once again study the

promising field of canine detection.

Before the use of canine olfaction, farmers previously relied on

human visual detection and PCR confirmation to determine if a

tree was infected with the CLas bacterium. However, each of these

methods presented severe shortcomings. Visual detection consists of

a trained surveyor walking through an orchard and looking for host

responses to the infection, i.e. symptoms of the disease. There are two

main challenges to visual detection: latency and absence of visual clues.

“The latency in symptom development can be anywhere from months

to years after an infection takes place,” Gottwald said. By the time a

surveyor observes an infected leaf, the tree could have already served

as a reservoir of bacteria and spread the infection to surrounding

trees. Furthermore, even if a leaf does display symptoms, it is often

difficult for a person to see them. The orientation and location of a

symptomatic leaf on a tree can lead to missed infections. On the other

hand, PCR confirmation is a molecular assay run on tissue samples

from leaves that indicate whether the tissue is infected (i.e. has CLas

DNA) or not. “PCR is almost a perfect assay. If you have infected

tissue, [it will test positive],” Gottwald said. However, PCR presents

a sampling problem: There are a tremendous number of leaves on a

mature tree, and early in the infection, only a few leaves are infected.

In more advanced infections, the CLas bacterial infection may still be

confined to sectors in a tree, so not every leaf will be infected. Even

within a leaf that is infected, not every cell will contain the bacteria.
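The sampling problem can be made concrete with a little hypergeometric arithmetic; the leaf counts below are invented for illustration:

```python
from math import comb

# Illustrative sampling arithmetic (leaf counts invented): if only k of a
# tree's N leaves are infected and s leaves are PCR-tested at random, the
# chance of missing the infection entirely is C(N-k, s) / C(N, s).
def miss_probability(N: int, k: int, s: int) -> float:
    return comb(N - k, s) / comb(N, s)

# 20 infected leaves out of 50,000, with 10 leaves sampled:
print(f"{miss_probability(50_000, 20, 10):.3f}")  # 0.996 -- almost always missed
```

Even a perfect assay run on a random handful of leaves will, under these assumed numbers, overlook an early infection nearly every time.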

Canine olfaction solves many of the limitations faced by these

other methods. Dogs can detect very early infections in only a few

leaves anywhere in a tree. Thus, the latency problem associated with

visual signs of infection is eliminated, allowing for earlier diagnosis.

Canine detection is much more accurate than a human performing a

visual search of tree leaves. There is no reliance on a molecular assay

performed only on a few leaves, so canine detection also sidesteps the

sampling problem of PCR. Overall, the use of canine olfaction for the

detection of plant pathogens is a highly effective and accurate method

that does not suffer from the limitations present in conventional

detection methods. This new technique is already being implemented

in orchards across the country, saving the lives of countless trees and

livelihoods of many farmers. ■

Gottwald, T., Poole, G., McCollum, T., Hall, D., Hartung, J., Bai, J.,

... Schneider, W. (2020). Canine olfactory detection of a vectored

phytobacterial pathogen, Liberibacter asiaticus, and integration

with disease control. PNAS, 117(7). https://doi.org/10.1073/

pnas.1914296117



FEATURE

Computer Science

AN ALGORITHMIC

JURY

PREDICTING

RECIDIVISM

RATES WITH

ARTIFICIAL

INTELLIGENCE

BY MIRILLA ZHU

Over the last two decades,

predictive risk assessment tools

have been used to determine

the fates of millions of individuals in the

criminal justice system, deciding whether

a defendant will be detained or released

based on an algorithmic calculation of

risk. This technology has been embraced

by courts and policymakers alike, with one

Congressional bill going as far as to call

for the implementation of risk assessment

systems in every federal prison. But in

2018, researchers Julia Dressel and Hany

Farid published a surprising result: a

commonly used risk assessment tool

named COMPAS was incorrect almost

half the time. With the accuracy rate of

COMPAS only a few percentage points

higher than that of humans with no

judicial experience, some judges were left

wondering whether they would be better

off not using algorithms at all.

When Stanford graduate student

Zhiyuan Lin heard about Dressel and

Farid’s study, he was equally surprised at its

findings—although for a different reason

than the public. As a computer scientist

in Stanford’s Computational Policy Lab,

Lin had encountered dozens of studies

demonstrating that algorithms performed

better than humans, and he was puzzled

why Dressel and Farid had found otherwise.

Together with a team of researchers from

Stanford and Berkeley, Lin decided to see

whether he could fill in the missing pieces

to understand what was going on.

Lin and his colleagues began by

attempting to replicate the 2018 study,

giving over six hundred participants

the same set of profiles that Dressel and

Farid used and asking them to predict

whether the defendants would recidivate.

When they provided participants with

immediate feedback after each response,

they found that the participants guessed

correctly sixty-four percent of the time,

compared to the sixty-two percent

accuracy rate reported in the 2018 study.

The COMPAS algorithm’s accuracy rate

of sixty-five percent matched the 2018

study exactly.

Next, the researchers investigated

whether these results would hold if they

modified the experiment to resemble the

real world more closely. They did so in

three ways: providing the respondents

with more detailed criminal profiles,

lowering the average recidivism rates

to reflect the rate of violent crime,

and most significantly, not telling the

respondents whether they were right

or wrong. “Receiving this kind of

immediate feedback is something that

rarely happens in reality, because when

the judges are making bail decisions,

they don’t find out whether a defendant



will recidivate until two years later,” Lin

said. “More often than not, they don’t

see the outcome at all.”

Under these new conditions, the

algorithm performed substantially

better than humans. This accuracy gap

was especially pronounced in the case

of violent crime, for which the study

participants consistently overestimated

the risk of recidivism.

When feedback was present, the participants adjusted their predictions to reflect

the lower recidivism

rate, but when they

didn’t receive feedback,

they continued to

guess incorrectly forty

percent of the time.

In comparison, the

algorithm was correct

eighty-nine percent of

the time. Lin noted that

this percentage may have

been skewed by the low

recidivism rates, since

a simple algorithm that

guessed “no” each time

could have achieved the

same score. But even

under a different measure that accounted

for variations in the base recidivism rates,

the algorithm still performed better than

humans by achieving sixty-seven percent

accuracy. The researchers were able to

replicate their results with various risk

assessment tools including their own

statistical model, suggesting that these

improvements in performance were not

unique to the COMPAS algorithm.
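The base-rate caveat is easy to reproduce with simulated data. In the sketch below, balanced accuracy stands in for the rate-adjusted measure the researchers used; none of the numbers are from the study:

```python
import numpy as np

# Illustration of the base-rate caveat (simulated labels, not the study's
# data): with a low recidivism rate, a trivial "always predict no" rule
# scores high raw accuracy. Balanced accuracy is used here as a stand-in
# for the study's rate-adjusted measure.
rng = np.random.default_rng(1)
y = rng.random(10_000) < 0.11           # ~11% of defendants recidivate
pred_no = np.zeros_like(y, dtype=bool)  # trivial classifier: always "no"

raw = (pred_no == y).mean()             # ~0.89 -- looks impressive
tpr = pred_no[y].mean()                 # 0.0  -- catches no recidivists
tnr = (~pred_no[~y]).mean()             # 1.0
balanced = (tpr + tnr) / 2              # 0.5  -- no better than chance
print(raw, balanced)
```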

For researchers like Dressel, however,

Lin’s findings emphasize just how limited

algorithms can be. Accuracy rates under

seventy percent are still “really low,” she

said, given that “the consequences of

making mistakes is so high.” Dressel also

expressed concerns about racial bias,

citing a 2016 ProPublica study which

found that COMPAS predicted false

positives for black defendants at almost

twice the rate of white defendants.

“A fundamental principle of machine

learning is that the future will look like

the past, so it’s not surprising that the

predictions being made are reinforcing

inequalities in the system,” she said.

Lin acknowledged the shortcomings

of algorithms, but he said that humans

exhibit bias too—and that the biases now

embedded in algorithms initially arose

from humans themselves. Since people

often make decisions in an inconsistent

manner, even imperfect algorithms

could inject a degree of objectivity into

an arbitrary criminal justice system.

Lin emphasized that

these algorithms should

only be used for their

intended purpose—

risk assessment—and

that judges should

consider other factors

when making their final

decision. “There’s this

dichotomy of whether

we should rely only

on humans or only on

artificial intelligence,

and that’s not really how

things work around here,”

Lin said. “Algorithms

should be complementary

tools that help people

make better decisions.”

In order to ensure that

algorithms are being

used correctly, Lin

believes that policymakers must be aware

of how they work. With its black-box

formulas that are protected as intellectual

property, the COMPAS software has not

been conducive to fostering this kind

of understanding. However, developing

transparent and interpretable algorithms

is very much possible. In another

study, to demonstrate accessibility

without compromising accuracy, Lin

created an algorithm with an eight-step

checklist that can be scored manually by

prosecutors to track exactly how risk can

be calculated. The checklist is simpler

than many traditional machine learning

models, yet it performs just as well in

real-life situations.
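The appeal of such a checklist is that the entire model is a short weighted sum. A hypothetical sketch, with items and weights invented rather than taken from the published checklist:

```python
# Hypothetical sketch of a checklist-style risk score: a few yes/no items
# with small integer weights that can be summed by hand. The items and
# weights are invented here and are not those of the published checklist.
CHECKLIST = [
    ("prior_arrests_over_two", 2),
    ("age_under_25", 1),
    ("pending_charge", 1),
    ("failed_to_appear_before", 2),
]

def risk_score(answers: dict) -> int:
    return sum(weight for item, weight in CHECKLIST if answers.get(item, False))

print(risk_score({"age_under_25": True, "pending_charge": True}))  # 2
```

Because every item and weight is visible, a prosecutor can see exactly why a score came out the way it did, which is the transparency black-box tools like COMPAS lack.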

But given that neither algorithms

nor humans are perfect predictors of

recidivism, Dressel suggests that our focus

should not be on developing better tools,

but rather reducing our reliance on them.

Enacted this January, the New York bail

reform law is an instance where the role

of risk assessment has become essentially

obsolete—all pretrial detainees arrested

for nonviolent crimes are allowed to go

free without posting bail, regardless of

perceived risk. According to a report

by the Center for Court Innovation, the

reform could decrease the number of

pretrial detainees by forty-three percent,

which is especially significant given that

eighty-five percent of them are Hispanic or

black—by no coincidence the same races

overrepresented in algorithmic predictions

of high-risk individuals. “I think what

New York did is great,” Dressel said. “The

decisions we’re making in a pretrial context

shouldn’t be based on someone’s risk. We

shouldn’t sacrifice anyone’s liberty until

they’ve had a fair trial.”

Still, many researchers believe there’s a

place for algorithms within the criminal

justice system. “It’s a bit premature to be

using these kinds of algorithms now, but

I think we will be seeing more of them

in the future,” said Nisheeth Vishnoi, a

computer science professor and founder of

the Computation and Society Initiative at

Yale. “It’s good that people are scrutinizing

them, because what that is doing is creating

new dialogue around these issues.” A

proper application of machine learning

algorithms, he says, will require learning

in all directions—from policymakers,

scientists, and each other. ■

ART BY ELLIE GABRIEL

Dressel, J. & Farid, H. (2018). The accuracy, fairness, and limits of predicting recidivism. Science

Advances, 4(1). https://doi.org/10.1126/sciadv.aao5580

Lin, Z., Jung, J., Goel, S., & Skeem, J. (2020). The limits of human predictions of recidivism.

Science Advances, 6(7). https://doi.org/10.1126/sciadv.aaz0652

Lin, Z., Chohlas-Wood, A., & Goel, S. (2019). Guiding prosecutorial decisions with an interpretable

statistical model. In AAAI/ACM Conference on AI, Ethics, and Society (AIES ’19). https://doi.

org/10.1145/ 3306618.3314235



FEATURE

Genomics / Clinical Research

LETTING

EXPERIENCE

GUIDE THE WAY

The ultimate goal of clinical and

translational research is to leverage

scientific discovery and innovation

to drive improved treatments and patient

outcomes. In light of this goal, it seems

reasonable that patients and the public

should be involved in the research process.

Yet, there is a disconnect between researchers

and the public, even in translational settings.

There is increasing awareness of the

necessity of patient and public involvement

in both clinical and preclinical research.

There are several disadvantages at play

when these partnerships are absent.

For one, scientists often lack personal

experience with the diseases they study.

As a result, they may not be attuned to the

most pressing treatment needs in disease

communities, potentially limiting their

ability to translate knowledge production

into knowledge use. In other words, lack of

patient input on the research process can

result in research waste––in which scientific

communities produce research findings

that have minimal real-world application.

Moreover, given that a great deal of research

is publicly funded, scientists have a duty to

be accountable and transparent with the

public. This is greatly facilitated by public

involvement in research, which therefore

has inherent democratizing value.

BY ZOE POSNER

What is patient and public involvement?

The patient and public involvement (PPI) model is an approach that acknowledges the need to include patients and the public in research. PPI is flexible and can consist of varying degrees of participation in the research process. Involvement often

focuses on including patients in research

design and in disseminating results. Less

commonly, it encompasses participation

in data analysis and methodological

design. Significantly, PPI methods can be

implemented in a broad range of research

areas, from clinical studies and cancer

research to groundbreaking research in

the basic life sciences.

PPI offers several benefits over traditional

research, which is executed solely by the

investigating team of scientists. As implied

by its name, PPI democratizes research by

increasing the number of voices included

in the research process. Researchers are

optimistic that including public voices

can produce studies that are more ethical

and practical in nature. In health studies

especially, patients can offer critical and

overlooked perspectives in the research

process. They can highlight aspects

of the disease and treatment that are

deprioritized in the academic community

(such as drug-induced cytotoxicity), and

also help researchers identify the most

pressing pathobiological questions in the

patient community.

Recently, PPI has been gaining traction for

its potential to address representational

inequality in research. There is a stark lack of minority representation in the biomedical sciences and health professions. As a

result, the unique health perspectives and

grievances of minority populations are

easily overlooked. Additionally, those who

participate in clinical trials or experimental

drug treatments are typically those with the

best access to healthcare––most frequently,

individuals who are affluent and white.

These factors contribute to persistent

disparities in health outcomes. PPI, by

including a diverse patient population, can

help foreground the health experiences

of marginalized populations. Researchers

who conduct PPI studies are aware of the

need to increase diversity in research.

While this awareness is encouraging, many

of these researchers recognize that thus

far, a majority of patient involvement in

research is by wealthier white patients.

Groups like the Community and Patient-Partnered Research Network (CPPRN), which

focuses on improving mental and

behavioral health outcomes for Black and

Hispanic populations, are working towards

ensuring greater diversity in patient

networks to overcome this challenge.

Finally, patient-centered research holds

promise for improving the study of rare

pathologies, including rare cancers.

Rare diseases are difficult to study due to

restricted availability of patient samples.

Patient-centered approaches to rare

diseases, by generating patient networks,

can facilitate the collection of patient

samples and data over a broad geographic

range. This can compensate for the low

frequency of disease incidence. The Count

Me In Initiative has spearheaded five

projects involving patient-partnered cancer

research. Most recently, the initiative has

launched The Angiosarcoma Project,

which focuses on angiosarcoma—a rare,

notoriously aggressive cancer that develops

in the lining of blood and lymph vessels.

The Angiosarcoma Project: A model for PPI

The Angiosarcoma Project, with 338

patients, is the largest angiosarcoma

project to date and has produced several

novel and high-impact findings. The

project was led by Corrie Painter of the

Broad Institute. In the initial stages of

the project, Painter ensured that patients

were involved in the development

of the online platform that would

then be accessible to them and other

patients. After receiving guidance from

angiosarcoma patients, Painter and

her team built out the project, received

direct patient feedback, and synthesized

that feedback in an iterative process.

This serial generation-feedback-revision

IMAGE COURTESY OF FLICKR




loop was applied to everything on the Angiosarcoma Project website.

Once the project was launched,

high participation rates were almost

immediate. After joining the project,

participants could give consent to share

online medical records, and send in saliva,

blood, and tumor samples. Throughout

the process, research updates were

continuously disseminated to patients

through the online platform, allowing

patients to provide feedback.

Through the project, over seventy tumor

samples were obtained, allowing for large-scale whole-exome sequencing, a method

that sequences protein-coding regions

in the genome. Painter’s team could also

access medical history data. Integrating

these types of data enabled robust analysis

of multiple subclasses of angiosarcoma.

Sequencing data from patient tumor samples identified three genes––

PIK3CA, GRIN2A, and NOTCH2––that

were consistently altered in angiosarcoma.

Subsequent analysis of mutation frequency

indicated that the PIK3CA gene is one

of the most commonly mutated in breast

angiosarcoma. This finding is clinically

relevant, pointing to PI3Kα inhibitors

as a potential therapeutic route for primary

breast angiosarcoma treatment. Finally,

Painter looked at the mutational burden,

which is the number and type of somatic

mutations in the DNA of cancer cells.

She found that the mutational burden in head-neck-face-scalp (HNFS) angiosarcoma is significantly elevated, in a pattern consistent with UV-light-induced

DNA damage. This finding suggests that

this angiosarcoma cohort might respond

well to immune checkpoint inhibitors.
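As a toy illustration of how a UV-like pattern can be spotted, one crude proxy is the fraction of single-base substitutions that are C>T (or G>A on the opposite strand), the change most characteristic of UV damage. This sketch is an illustrative assumption, not the project’s actual analysis pipeline:

```python
def uv_like_fraction(substitutions):
    """Fraction of (ref, alt) single-base substitutions that are C>T,
    or G>A on the opposite strand -- a crude proxy for a UV-light
    mutational signature."""
    uv_like = sum(
        1 for ref, alt in substitutions
        if (ref, alt) in {("C", "T"), ("G", "A")}
    )
    return uv_like / len(substitutions)

# Hypothetical substitution list from one tumor sample
sample = [("C", "T"), ("C", "T"), ("G", "A"), ("A", "G"), ("C", "T")]
```

A cohort whose samples score high on a measure like this, relative to other tumor sites, would hint at sun-exposure-driven mutation, as the study observed for HNFS tumors.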

The novel findings produced by this

project are already making an impact. “[Our

project is] decoupled from the publication

process entirely, so we’ve been releasing

data for well over a year,” Painter said. By

presenting at Clinical Oncology Alliance

Meetings and to The American Society of

Clinical Oncology (ASCO), Painter was able

to share her results with clinical researchers.

As a result, “There are three different groups

working on drafting clinical trials. One

of them was able to get angiosarcoma as a

cohort in an existing checkpoint inhibitor

study,” she said. Painter also mentioned that

two additional studies are being currently

developed based on her data. Painter also

anticipates the potential for pre-clinical

researchers, who are not necessarily

studying angiosarcoma, to see results from

her project that involve pathways or genes

of interest and to become interested in

partaking in angiosarcoma research.

The Angiosarcoma Project is a significant

step forward for patient-partnered

research. Painter demonstrated how a

sincere and deep level of collaboration

between patients and researchers can

stimulate the most meaningful and

translationally relevant results.

Challenges and future directions for

PPI in research

Successful patient-partnered research,

like the Angiosarcoma Project, is

popularizing PPI. This is reflected in an

increasing number of PPI studies, as well

as the proliferation of grant applications

that require researchers to describe how

they plan to involve patients of the public

in their studies. As this approach becomes

more popular, it is important to consider

current challenges and future directions

of this type of research.

The single greatest challenge to expanding

PPI in research is scientists’ lack of clarity

on the most effective ways to facilitate

engagement with patients. Painter, when

discussing the Angiosarcoma Project,

noted that building the project “was

much easier than the metastatic breast

cancer project because we were going off

a vision.” For her, the metastatic breast

cancer project provided a scaffold that

was subsequently utilized to build out the

Angiosarcoma Project in a way that was

tailored to that specific patient community.

Painter’s insight highlights how researchers

can draw from previous studies in order to

guide and facilitate their own PPI studies.

Painter notes that it is imperative to adapt

each project to the needs of the specific

disease community, but that having a pre-existing vision is still highly useful.

As PPI expands, ensuring a continued

commitment to patient diversity is

critical. One way to facilitate this is by

ensuring that patient-advocates from

diverse backgrounds are included, since

these advocates are the cornerstone of

research outreach efforts.

Finally, while most research integrating

a patient-centered approach involves

clinical research or translational research

that makes use of patient samples,

there is a strong argument for patient-

public involvement in pre-clinical and

basic science research as well. Emma

Dorris is a molecular biologist at University College Dublin who also leads

a PPI initiative for Arthritis Research.

She argues that PPI elevates research by

increasing the relevance and impact of

projects. Dorris believes that patients can

provide novel insights that direct scientists

towards areas of a disease’s biology that

haven’t been previously studied. While

in a wet-lab setting patients cannot be

involved in the data collection or analysis,

there is a clear and meaningful space for

their involvement in defining research

questions and goals. In order to encourage

researchers in preclinical labs to effectively

integrate PPI, experts recommend training

in patient-communication, as well as

top-down incentives and infrastructure

support from research institutions.

Patient-partnered research holds immense

promise for biomedical science. It offers to

improve the quality and relevance of research,

improve relationships between researchers and

the public, overcome boundaries to studying

rare diseases, and help ameliorate racial

and socioeconomic inequalities in research.

As PPI studies continue to expand, critical

examination of what types of engagement are

most effective will be necessary. ■

Burns, J. A., Korzec, K., & Dorris, E. R. (2019). From intent to implementation: Factors affecting public

involvement in life science research. doi: 10.1101/748889

Jayadevappa, R. (2017). Patient-Centered Outcomes Research and Patient-Centered Care for Older

Adults. Gerontology and Geriatric Medicine, 3, 233372141770075. doi: 10.1177/2333721417700759

Pii, K. H., Schou, L. H., Piil, K., & Jarden, M. (2018). Current trends in patient and public involvement in

cancer research: A systematic review. Health Expectations, 22(1), 3–20. doi: 10.1111/hex.12841

Staniszewska, S. (2020). A patient–researcher partnership for rare cancer research. Nature Medicine,

26(2), 164–165. doi: 10.1038/s41591-020-0766-y


IMAGE COURTESY OF PXHERE



FEATURE

Bioengineering

KEEPING DRY UNDERWATER

LEARNING SUPERHYDROPHOBICITY FROM PLANTS

BY YU JUN SHEN

ART BY ANMEI LITTLE

Hydrophobic materials have many

applications, yet many are easily disrupted by their environment and lose their water repellency. To find the key

to the next generation of highly water-repellent

materials, scientists have

turned to Mother Nature for inspiration,

studying species that thrive in water,

like lotus plants and ferns.

Recently, Xiang Yaolei and his team

at Peking University investigated the

hydrophobic leaves of Salvinia molesta,

a sturdy fern species best known for

being highly invasive. The researchers

discovered that the Salvinia leaf

had evolved surface patterns ideal

for generating a smooth layer of air

underwater. They then replicated this

design on a 3D printed specimen.

This work paves the way for improved

underwater applications, such as reduced

drag on underwater vehicles and improved

protection against corrosion in pipes.

In a laboratory study of carefully

degassed underwater Salvinia leaves, the

researchers observed that an air layer

forms spontaneously on the leaf surface.

To rule out residual air, the researchers first applied high water pressure to remove any trapped air from the submerged leaves, then injected new air via a

small syringe. “It is different from a lotus

leaf. Here, a whole layer of air forms, while

for the lotus only a few individual bubbles

appear. The Salvinia mechanism is an

active replenishment of air,” Xiang said.

Scientists and engineers sought to

discover the underlying design behind

the Salvinia leaf’s active replenishment mechanism. Using a scanning electron microscope, the team found three key features behind the Salvinia leaf’s hydrophobicity: interconnected wedge-shaped microgrooves on the leaf’s surface, long hair stems, and eggbeater-shaped heads. The wedge-shaped microgrooves enable stable air pockets to form and expand by capillary action, against the pull of gravity. The interconnected and

widespread microgrooves allow air to

spread efficiently and spontaneously

across the leaf surface.

The researchers found that, once

formed, the air layer then rises along

the frame provided by the hairy stems.

Moreover, the eggbeater-shaped head,

which caps the microgrooves and hair

stems, stabilizes the entire air layer by

surface tension. As the stems of the plant

are irregular, the air layer arrived at the

top of short stems will pin and “wait” for

the air layer arrived to the top of high

stems. These three features combine

to actively replenish the air layer, even

in the presence of water flow. Hence,

compared to passive hydrophobic

materials, the Salvinia design is more

reliable in sustaining an air layer.

Xiang is enthusiastic about the

industrial applications of Salvinia-inspired

hydrophobic materials. “As it is

an efficient way to protect the air mattress

in different environmental conditions, it

will expand the applications of superhydrophobic

surfaces, especially in

extreme environments,” Xiang said.

An application of particular interest

is the anti-corrosive coating on ship

keels. Corrosion below the waterline

weakens and damages the ship. A passive

solution is anti-corrosion chemical

paint, though it degrades over time and

leaves environmental residues. Current

active systems inject bubbles to stick

to the surface, but this must be done

continuously and while fully submerged.




IMAGE COURTESY OF WIKIMEDIA COMMONS

Hydrophobic leaves help aquatic plants thrive in

wet environments.

Xiang believes an industrial translation

of the Salvinia system—microgrooves,

long stems and eggbeater-shaped

heads—could create a self-replenishing

active system that prevents prolonged

wetting along the ship hull. A similar

approach might protect pipelines as

well, reducing maintenance costs and

improving water quality.

Another potential application is drag

reduction to enable faster underwater

vehicles. An object moving in a fluid

experiences a resistance to its motion,

which can be reduced significantly

if the object is enveloped in an air

cocoon. Compared to current methods

of generating this air cocoon, Xiang’s

research may have wider usage. “Methods

like supercavitation work only at high

speeds,” Xiang said. The Salvinia-based

design could achieve drag reduction

“even at lower Reynolds numbers,”

where the object moves slowly in a less

turbulent manner. In the lab, the air

layer is retained at a low fluid velocity of

half a meter per second. Further research

would be needed to scale up this speed

for use in ships, where the water flows at

a few meters per second.
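The Reynolds number Xiang refers to compares inertial to viscous forces in a flow, and a few lines of Python show the gap between the lab test and ship conditions. The water properties and length scales below are illustrative assumptions, not values from the paper:

```python
def reynolds_number(velocity_m_s, length_m, density=998.0, viscosity=1.0e-3):
    """Re = rho * v * L / mu, using water near room temperature
    (density ~998 kg/m^3, dynamic viscosity ~1e-3 Pa*s)."""
    return density * velocity_m_s * length_m / viscosity

# Lab retention test: ~0.5 m/s over the 4 mm specimen
re_lab = reynolds_number(0.5, 0.004)
# Ship-scale flow: a few m/s over a ~1 m stretch of hull
re_ship = reynolds_number(3.0, 1.0)
```

The jump of over a thousandfold in Reynolds number between the two regimes is one way to see why scaling the air layer up to ship speeds remains an open problem.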

To replicate Salvinia’s natural patterns

on artificial surfaces, the Peking

University researchers 3D printed a

specimen with regular microgrooves,

long stems, and eggbeater bulbs. Due

to the extreme precision required, the

overall sample was a square four millimeters on a side. The team printed

a Salvinia-inspired design (complete

with microgrooves, stems and eggbeater

heads) as well as a control specimen

(with stems and eggbeater heads only).

Using a confocal microscope, Xiang and

the researchers found that a smooth layer

of air only appeared in the first sample.

In the control, individual air bubbles

formed instead. Hence the Salvinia’s

microgrooves are an essential component

to achieving a smooth air layer, even more

so than the hydrophobic lotus leaf, which

only has long stems and eggbeater bulbs.

The lotus, as represented by the control,

cannot recover its air layer once disrupted.

In a further investigation using the

3D printing apparatus, the researchers

varied the microgroove angles to test the

predictions of their air layer formation

theory. Using a thermodynamic free

energy model, the researchers calculated

the angle requirement for a stable air

layer to form spontaneously. The natural

Salvinia leaf ’s microgroove angle matched

that range, and additional 3D printed

tests verified the range of full, partial,

or no expansion. Again, experiments

showed that the microgroove pattern

is critical for the Salvinia plant’s strong

hydrophobic capability.
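The flavor of such an angle criterion can be sketched with the classic Concus–Finn wedge-wetting condition. This is a textbook analogue assumed here for illustration, not the authors’ exact free-energy model: in a groove of half-angle α, the gas phase spreads spontaneously when the water contact angle θ exceeds 90° + α.

```python
def wedge_fills_with_air(contact_angle_deg, half_angle_deg):
    """Concus-Finn-style criterion (illustrative, not the paper's model):
    in a wedge of half-angle alpha, the gas phase spreads spontaneously
    when the water contact angle theta exceeds 90 + alpha degrees."""
    return contact_angle_deg > 90.0 + half_angle_deg

# A strongly hydrophobic surface (theta ~150 deg) with a 30-degree
# half-angle groove meets the condition; a mildly hydrophobic one
# (theta ~100 deg) does not.
```

Sharper grooves (smaller α) make the condition easier to satisfy, which matches the intuition that the wedge geometry, not just the surface chemistry, controls whether the air layer can spread.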

Following his lab’s focus on boundary

layer stability research, Xiang and his

team’s research on the Salvinia leaf gives

new understanding of the boundary

layers of a submerged body. “We wanted

to find out what made the air layer on

Salvinia plant so stable. It turns out

the hydrophobicity gives an extremely

strong adaptability to environmental

conditions. This also provides a

theoretical basis for the design of

artificial bionic materials,” Xiang said.

The Salvinia leaf, which actively

replenishes its air layer through its distinctive anatomy, is one of the most effective water-repellent

surfaces we know of currently. A

humble leaf holds many secrets. ■

Xiang, Y., Huang, S., Huang, T., Dong, A., Cao, D., Li, H., Xue, Y., Lv, P., & Duan, H. (2020). Superrepellency

of underwater hierarchical structures on Salvinia leaf. PNAS, 117(5), 2282-2287.



THE NEW 98.6 DEGREES

How and why human body temperature has lowered

COUNTERPOINT

BY KELLY FARLEY

IMAGE COURTESY OF ROY PERRY

When you go to the doctor, the first measurement

taken—whether it’s an annual check-up or a sick

call—is your body temperature. Over the years, the

method has varied, from mercury thermometers under the

armpit to infrared thermometers scanned over the forehead. But

the standard indicator of health has remained the same since

1851, when it was first reported by the German physician Carl

Reinhold August Wunderlich. As you’ve probably heard before,

the normal human body temperature is 98.6 degrees Fahrenheit.

Julie Parsonnet and her colleagues at Stanford, however, have found

evidence that disrupts this paradigm: they recently reported that

human body temperature has steadily decreased over the past two

centuries to a current average of 97.9 degrees Fahrenheit. In the most

comprehensive analysis to date, their study examined hundreds of

thousands of temperature measurements from three databases ranging

from the end of the Industrial Revolution to present day. In the end,

they found a constant decrease in temperature from decade to decade.

Parsonnet’s study is important not because it shows that human

body temperature is lower but because it shows that it has dropped

since the nineteenth century. Medical professionals have known for

the past few decades that healthy human body temperature is lower

than the 98.6 degrees Fahrenheit standard. A Russian pharmacy

chain founded in 1991 is even named 36.6—the Celsius equivalent

to 97.9 degrees Fahrenheit—in honor of the more accurate lower

temperature. Scientists previously assumed that 98.6 degrees

Fahrenheit must have been wrong at the time of measurement

as well, blaming differences in historical measurement methods

and instrument calibration. According to Parsonnet, the original

number may not have been wrong at all. It is just no longer accurate

for modern humans. The question is: why?

“Temperature is a marker of metabolism,” Parsonnet said. With

lower body temperatures, our metabolism must be slower. Perhaps

this change is caused by our more temperate environments.

With heating and air conditioning, modern Americans live

in the “thermoneutral zone” of sixty-four to seventy-two

degrees Fahrenheit in which our bodies do not have to increase

metabolism to keep warm. With inactive modes of transport and

sedentary desk jobs, we also move less, further suppressing our

metabolic rate and possibly explaining the rise in obesity.

Parsonnet prefers an alternate explanation: our cleaner

environments. Thanks to sewage systems, hand sanitizer, antibiotics,

and modern infrastructure, we have decreased the rates of formerly

widespread infections, such as syphilis, rheumatic heart disease, and

tuberculosis. Thanks to vaccination, we have minimized infectious

diseases of the past and have hopes to apply the same to infectious

diseases of the present. All of this leads to decreased inflammation,

which in turn leads to lower metabolism and thus a lower body

temperature. It is uncertain whether this decreased inflammation is

true everywhere; the modern temperatures in the study were collected

in the United States. Moving forward, Parsonnet is interested in

examining temperatures in developing countries as well.

If the “normal” human body temperature is lower, does this

challenge the way we approach fever and medical diagnosis?

Though taking your temperature may be the first thing your doctor

does, it is never the last. Think about the family history, blood

pressure readings, throat swabs, blood draws, and everything else

that goes into a sick call. We are drawn towards binaries: sick versus

healthy, feverish versus normal. But temperature varies from

person to person and even within a person over the course of a day.

The real question is not why our 98.6 degrees Fahrenheit

standard was wrong but rather why we rely on it so heavily

when every person is an individual. There is no overall “normal”

human standard. Instead, “There is a ‘normal’ for each person

that depends on their age, sex, weight, height, and the time

of day their temperature was measured,” Parsonnet said. Her

team is already working on an algorithm that determines what

is abnormal for an individual patient at any particular time. In

the age of big data and personalized medicine, we are beginning

to see patients as individuals instead of averages. Today, we are

surprised that the 98.6 degrees Fahrenheit standard has been

replaced by 97.9 degrees Fahrenheit. In the future, we may be

surprised that we relied on an average at all. ■

Barondess, J. (2014). Scanning the Chronic Disease Terrain: Prospects

and Opportunity. Transactions of the American Clinical and Climatological Association, 125, 45–56.

Fischer, K. (2020, January 20). Forget 98.6 degrees Fahrenheit. Humans Are

Cooling Off — Here’s Why. Healthline.

Protsiv, M., Ley, C., Lankester, J., Hastie, T., & Parsonnet, J. (2020). Decreasing

Human Body Temperature in the United States Since the Industrial

Revolution. eLife, 9, e49555.



SCIENCE VS. THE APOCALYPSE

ANTIBIOTIC RESISTANCE

BY VICTORIA VERA

IMAGE COURTESY OF WIKIMEDIA COMMONS

Since the 1928 discovery of penicillin, the first antibiotic to be commercialized, our society has dramatically improved.

We have raised life expectancy, improved quality of life,

and altogether created a healthier world. However, it’s not just

us who have adapted. Bacteria have entered a new age as well,

one characterized by increasing resistance to the antibiotics we

create. Antibiotic resistance was first observed in 1947, around

six years after commercial production of penicillin began. Since

then, the problem has become much more widespread. This poses

a scary new problem for us. In the future, might it be possible

for an infected piercing or scrape to bring us to our deathbeds?

Worldwide, scientists have been working tirelessly to prepare for

when our main line of defense finds itself compromised. While

studying antibiotic-producing bacteria known as actinomycetes,

researchers in the Wright lab at McMaster University in

Canada stumbled upon possible solutions: two new functional

antibiotics, and a way to predict more.

All this was a result of mapping ancestry. Rather than directly

searching for a new antibiotic, Wright’s research team sought to

investigate more open-ended ideas. The motivating factor was a

simple question: “What are the origins of antibacterial resistance?”

Wright said. Many antibiotics, including penicillin, are derived from

biological organisms. Wright and his team first traced the ancestral history of the antibiotic properties of actinomycetes, bacteria

found in soil that manufacture many of our current antibiotics.

Actinomycetes derive their antibiotic-producing capabilities from

biosynthetic gene clusters (BGCs)—groups of two or more genes

that, coupled together, encode a pathway for the production of a

specific metabolite, such as a product with the antibiotic properties

we need. The researchers first gathered sequences of antibiotic

BGCs from multiple actinomycete species. Then, they began slowly

building phylogenetic trees—diagrams mapping out evolutionary

relationships—of the BGCs, looking for a common ancestor. As

this effort advanced, they noticed previously untapped antibiotic

BGCs. This gave rise to another fundamental question: “Where do

these things come from?” Wright asked.

As the researchers continued to investigate, they found that some

of these newly discovered gene clusters encoded products that

blocked bacteria in entirely different ways than existing antibiotics.

These genes could then be purified and expressed—taken from the

bacteria in question and “shown [without other genes] in the way,”


Wright said—to create new antibiotics. The researchers had found

a way to predict possible antibiotics derived from actinomycetes.

Due to this mapping, they were able to specifically find two new

functioning glycopeptide antibiotics: complestatin and corbomycin.

Glycopeptide antibiotics combat bacteria by binding to peptidoglycan,

an important substance that makes up the bacterial cell wall. Complestatin and

corbomycin have a novel mode of action. Unlike other antibiotics,

which prevent the bacterial cell wall from being built, complestatin

and corbomycin keep it from being broken down, a critical step

during bacterial reproduction. As a result, the targeted bacteria cannot

divide and increase their numbers, and their harmful properties are

blocked. In mouse models, complestatin and corbomycin diminished

infection while maintaining a low rate of resistance development, a

promising sign in this early stage of research.

Wright’s team faced several challenges on the way. For one, the mere

act of constructing phylogenetic trees presented difficulties, as they had

to comb through many genetic sequences to find the links proving their

evolutionary relationships. Similarly, challenges also arose in purifying

and expressing these genes once they were identified. The researchers

faced a game of trial and error, changing a range of conditions to

investigate their effects on the production of functional antibiotics by

the bacteria. It was “like fishing in a pond,” Wright said; in this case,

they were looking for a rather small fish in a very large pond.

Where will this new discovery head? “Our plan is to continue to

look for new antibiotics of this new family,” Wright said, noting that

his lab has already identified several potential leads. He also hopes to

extend the methods used in this paper to investigate another antibiotic

family. “We have not yet decided on which one, but we are very hopeful

that the method will uncover new compounds,” Wright said.

Antibiotic resistance, especially today, is a larger problem than

you might envision. A world where antibiotics no longer work is

a world where a small cut could have a prognosis as foreboding as

cancer. We have to start looking at more creative ways to solve this

growing crisis. Wright’s team has begun thinking outside the box

already, which raises the question: what can we expect next? ■

Culp, E. J., Waglechner, N., Wang, W., et al. (2020). Evolution-guided discovery of antibiotics that inhibit peptidoglycan remodelling. Nature, 578, 582–587. https://doi.org/10.1038/s41586-020-1990-9



ALON MILLET (BR ’20)

BY KATHERINE DAI

IMAGE COURTESY OF ALON MILLET

No one can capture themselves in three

words—let alone someone who juggles

writing a master’s thesis, giving campus and

science tours, and peer tutoring for the introductory

biology sequence for the sixth semester. But when

asked about it, Alon Millet (BR ’20) rose to the

challenge: greedy for knowledge.

“What’s amazing about biology is that you can start at

the atomic level with biophysics and scale up to systems

biology, which is my field. Every step along the process,

you can see subtle connections as the scale changes,”

Millet said. Drawn to the unsolved mysteries in biology,

he satisfies his curiosity in the lab—probably even more

than in the classroom—a habit that started during his

freshman year of high school. Looking for a “nice side

thing to do,” he joined his high school’s cell biology lab

on a whim. After designing his first set of experiments,

however, Millet knew that research would become not only a full-time commitment but also a lifelong one.

He dedicated every spare second to tackling a daunting

challenge: addressing the global food insecurity crisis.

His solution of a plant steroid—specifically, a seed

coating that increased the agricultural yield per plant—

earned him meetings with Barack Obama and Bill Nye,

a patent, and the opportunity to work with the U.S.

Agency for International Development.

After four years focused on research over

conventional high school experiences, Millet gained

the confidence and initiative to join a lab within a week

of his first year at Yale. He hit the ground running,

ready to go all-in on research through the BS/MS track

for Molecular, Cellular and Developmental Biology.

Halfway through his sophomore year, he co-authored

his first publication in Science Immunology.

But his end goal of research extends beyond

publications. He finds fulfillment in answering yes to two questions: whether he learned something new, and whether he satisfied his curiosity. The possibility of reaping

new knowledge, and perhaps the thought of a few

lines in the next edition of a biology textbook, keeps

him motivated. “If I’m on a question, maybe I can be

the person who cracks it. If I stopped, I would never

know the answer,” Millet said.

The next big step for Millet is his master’s defense. His thesis focuses on his research at the forefront of cancer immunology in Professor Sidi Chen’s lab, where he is working to understand how the metabolic state of immune cells influences their response to tumors. This work holds the potential to open new therapeutic avenues, as insights about immune cell metabolism can be translated into new approaches to cancer immunotherapy.

In April, Millet will present his thesis to a committee of illustrious scientists: Nobel laureate James Rothman, Mark Mooseker, Tom Pollard, and Sidi Chen. “It’s not every day you have a Nobel laureate on your thesis committee. But beyond the fact that they’re incredible scientists, they have been unbelievably supportive whether I struggled with the actual science or more personal issues,” Millet said. He says this is a rare find: scientists who are doing world-class work and want to train the next generation of scientists who could be doing that world-class work. And he does not take this mentorship for granted.

Guided by the desire to follow in their footsteps and invest in the development of others, Millet has the long-term goal of becoming a professor. Next year, he will attend the Tri-Institutional PhD Program in Computational Biology and Medicine. Meanwhile, he has taken up peer tutoring for the introductory biology sequence as an intermediate step along the path of academic mentorship. “I have given some great tours and have done some good science in the lab. But out of all the things I’ve done on campus, I think the one I’m proudest of is being a peer tutor and how I have peer tutored,” Millet said. He finds excitement in bonding with students over new information and reliving his own experience of learning it for the first time. At his many late-night review sessions, his passion is palpable as he discusses biological principles, experimental design, and his legendary mock exam questions in front of over a hundred students.

For Millet, the connections he has formed with students are more than channels to share knowledge. He hopes that they have become avenues to transmit his love for biology and greed for knowledge—his way of paying the favor forward. ■

30 Yale Scientific Magazine September 2020 www.yalescientific.org


ALUMNI PROFILE

From his humble upbringing in Taiwan to his current position as Vice President of Science and Technology in the Research Division at IBM, Tze-Chiang Chen (PhD ’85) has established himself as one of the most influential researchers in electrical engineering. Born in 1951 to two teachers, Chen has maintained his scientific curiosity since childhood. “I always wanted to make something work by using mechanical, electrical, or optical components,” Chen said. “That is what inspired me to study science and technology.”

His childhood interest in science led him to pursue both a bachelor’s (’74) and a master’s degree (’76) in physics at National Cheng-Kung University in Taiwan, with a special focus on particle physics for his MS. After serving in the army for two years, Chen received a scholarship from Yale to study physics. “Not only did Yale have a very strong particle physics team, but it also had a long history with China,” Chen said. “It was a dream to come to Yale.” Chen arrived in 1978, received a master’s degree in Engineering and Applied Science a year later, and continued in the field until he graduated with a PhD in 1985.

Chen has worked on many impressive projects over the years. While at Yale, around 1980, he was offered an opportunity to work at PerkinElmer, a company commissioned by NASA to help develop the Hubble Space Telescope. For nine months, Chen drove between New Haven and Wilton, Connecticut, balancing his studies with his research work and ultimately developing a process that enabled the mirror coatings used on the Hubble project. Thanks in part to his contributions, the Hubble Space Telescope launched in 1990 and is still in operation today. For Chen, this first work project still occupies a special place among all his achievements. “I solved a problem and designed all the thin-film coating parameters… It is perhaps the project I am most proud of today,” he said.

Chen continued to solve problems and design parameters when he joined IBM in late 1984 as a research staff member. “IBM was known to be an innovation company, and it provided a great opportunity for pioneering semiconductor research,” he said. In fact, Chen’s primary focus was on semiconductors, materials that partially conduct current and are essential to many electrical devices. His groundbreaking work during the eighties on double-poly bipolar technology production laid the foundation for the semiconductor devices implemented in IBM mainframe computers used worldwide for scientific, banking, and other commercial applications. Throughout the nineties, Chen advanced dynamic random-access memory (DRAM) density, allowing semiconductors to store more data efficiently. And at the turn of the twenty-first century, Chen led a team of researchers in developing a dielectric material for complementary metal–oxide–semiconductor (CMOS) chips, the technology underlying the logic and memory in most modern computers. This work contributed to a global push in silicon microelectronics at many semiconductor companies. In the thirty-five years since joining IBM, Chen has remained an indispensable part of the company, overseeing the science and technology strategy for five laboratories across the world and continuing to monitor multinational work on other engineering projects.

IMAGE COURTESY OF TZE-CHIANG CHEN

Chen has accumulated many accolades over the years, receiving IBM’s highest honor when he was appointed an IBM Fellow in 1999. He later became a fellow of both the American Physical Society and the Institute of Electrical and Electronics Engineers (IEEE). In 2011, Chen received the IEEE Ernest Weber award for his high managerial achievement.

Underlying Chen’s ambition to come to Yale was an admiration for Yung Wing, an 1854 Yale graduate who was the first Chinese student to receive a diploma from an American university. Given his illustrious career, it comes as no surprise that Chen was named Asian American Engineer of the Year in 2005.

“I want to encourage Yale undergraduate students to engage more in science and technology, given Yale’s enormous amount of resources,” Chen said when asked to offer some advice to current Yale students. He also advised students to persist in the face of science’s many challenges. “[View hurdles] as an opportunity rather than a barrier… Through passion and perseverance, you can achieve something that makes you happy,” he said.

Once a recruiter for IBM, Chen now returns occasionally to Yale’s campus—when he’s here, he can typically be found at the Becton Center. If you’re lucky, perhaps one day you’ll meet him! ■

BY NADEAN ALNAJJAR

TZE-CHIANG CHEN (PHD ’85)



