#15 | The First Algae To Fix Nitrogen
Using AI for retinal imaging, personalized cancer vaccines, and more
Hello fellow curious minds!
Welcome back to another edition of The Aurorean.
In case you missed the details last week, we are hosting our first raffle giveaway at the end of April to celebrate our 3 month anniversary and growing community of thousands of STEM enthusiasts!
Complete our short survey form no later than 11:59pm EST on April 30th for eligibility to be one of 3 subscribers who will win a $50 Visa gift card.
As a reminder, if you have already taken the survey form we have shared these past few weeks, then you are already eligible and you do not need to take any further action. If you have not filled out our survey and want to be eligible for our raffle giveaway, click the link below.
One other quick point: last week’s poll broke yet another record in our reader participation rate. We love to see participation among our readers grow, and we were pleased to see that 37% of participants said 15 - 30 minutes was the ideal length of time to read in-depth about a STEM topic of interest. Our forthcoming neuroscience deep dive will be within this estimated reading length, and we can’t wait to share it with everyone! We’re in the midst of editing our drafted work and designing the accompanying visuals to create a clear and cohesive final product.
Here’s one other poll question we’ve been meaning to ask:
My favorite type of podcast format to listen to for STEM topics is… (Note: you can only select one option)
With that said, on to the news. Wondering what STEM discovered last week?
Let’s find out.
Quote of the Week 💬
Scientists Discover The First Organelle That Can Fix Nitrogen
“It’s very rare that organelles arise from these types of things… The first time we think it happened, it gave rise to all complex life.”
⌛ The Seven Second Summary: An international team of scientists announced the discovery of the first known organelle that can convert atmospheric nitrogen gas into ammonia and other usable nitrogen compounds to support the growth of its host organism.
🔬 How It Was Done:
In 1998, the researchers discovered a unique DNA sequence from a nitrogen-fixing bacterium in the Pacific Ocean. They called this bacterium UCYN-A.
The researchers then spent decades gathering samples of both UCYN-A and the marine algae it was found in, Braarudosphaera bigelowii, to study the behaviors and growth rates of these organisms in isolation and together in shared environments.
During this research process, the scientists also isolated, identified and labeled various proteins created by the bacterium and the marine algae. They discovered the marine algae produced certain proteins that UCYN-A did not; however, these proteins managed to find their way into the bacterium and enable an organelle inside UCYN-A to perform nitrogen fixation.
🧮 Key Results: The organelle inside UCYN-A that utilizes the marine algae’s proteins to fix nitrogen is just the 4th known example of an organelle arising from endosymbiosis. The researchers estimate this 4th endosymbiotic relationship originated ~100 million years ago, which is noteworthy because the other known examples of endosymbiotic relationships originated billions of years ago.
💡 Why This May Matter: Endosymbiosis is the phenomenon where one organism lives within the body or cells of another organism. Organelles born from this process play an essential role in understanding our world’s evolutionary history. For instance, mitochondria and chloroplasts are two other notable examples of endosymbiotic organelles, and they are largely responsible for the development of complex life on Earth.
🔎 Elements To Consider: It is possible that endosymbiotic relationships originated many more times than the 4 instances currently known to science, but it is exceedingly difficult for researchers to identify organelles where this may have occurred, disambiguate the distinct organisms involved, and reconstruct the evolutionary steps involved.
📚 Learn More: UC Santa Cruz. Paper 1. Paper 2.
Stat of the Week 📊
AI Makes Retinal Imaging 99x Faster and 3.5x Sharper
99x
⌛ The Seven Second Summary: Researchers from the United States’ National Institutes of Health used AI to dramatically improve the speed and clarity of how they image cells in the eye.
🔬 How It Was Done:
Speckle is a phenomenon where retinal images are obscured by grainy interference noise. To counteract it, scientists repeatedly image cells in the retina until the speckle shifts position and different parts of a cell become visible. Afterwards, the scientists undergo a laborious process of piecing all of their cell images together to create a final image that is speckle-free.
To improve this process, the researchers developed an AI system to recognize the differences between speckled and unspeckled images by training it on over 6,000 images.
When they tested the AI with speckled images it had not seen before, the system was able to remove the speckle and sharpen the overall image quality beyond what was possible with the researchers’ manual process.
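To make the averaging step concrete: speckle behaves like random noise that changes from frame to frame, so averaging many aligned frames suppresses it while the fixed cell structure survives. The toy sketch below illustrates only that statistical idea; the signal values, noise model and code are our own illustration, not the NIH pipeline:

```python
import random

def capture_frame(true_signal, speckle_strength=0.5):
    """Simulate one retinal frame: the true cell signal corrupted by
    multiplicative speckle noise that changes from frame to frame."""
    return [s * (1 + random.uniform(-speckle_strength, speckle_strength))
            for s in true_signal]

def average_frames(frames):
    """The manual approach: average many aligned frames so the random
    speckle cancels out while the fixed cell structure remains."""
    return [sum(pixels) / len(frames) for pixels in zip(*frames)]

random.seed(0)
truth = [1.0, 2.0, 3.0, 4.0]                        # idealized cell brightness
single = capture_frame(truth)                       # one speckled frame
stack = [capture_frame(truth) for _ in range(120)]  # ~120 frames, as in the study
restored = average_frames(stack)

# The averaged stack lands much closer to the true signal than any one frame.
err_single = max(abs(a - b) for a, b in zip(single, truth))
err_stack = max(abs(a - b) for a, b in zip(restored, truth))
print(err_stack < err_single)
```

The trained network’s job, in effect, is to learn this speckled-to-clean mapping from a single frame, which is why it can skip the 120-frame stack entirely.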
🧮 Key Results:
Trained researchers typically require 120 images to create a final, speckle-free image. This AI system only needed to process 1 image to create a speckle-free image of similar quality to the experts.
Additionally, the AI drastically reduced the time required for the researchers to process retinal images, from ~13 days to roughly 3 hours. This represents a ~99x improvement.
Furthermore, when the AI system processed their images, the images had a 3.5x better image contrast resolution when compared to their existing methodology.
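As a quick sanity check on the headline figure (assuming “13 days” means 13 full 24-hour days of processing time):

```python
manual_hours = 13 * 24           # ~13 days of manual processing, in hours
speedup = 99                     # reported improvement factor
ai_hours = manual_hours / speedup

# 312 hours / 99 lands at roughly 3 hours, matching the reported timeline.
print(round(ai_hours, 2))
```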
💡 Why This May Matter: This study is an example of the sorts of AI use cases that can profoundly impact the field of ophthalmology and medicine more broadly. Imaging the retina with better clarity and significant time and cost savings may allow clinicians to detect and monitor the development of various eye diseases earlier than before, which can eventually lead to better diagnostics and more effective treatments for patients in need.
🔎 Elements To Consider: This study only used imaging data from 8 eyes of 7 different participants. It is unclear if this AI system generalizes well enough to be valuable for a larger, more diverse patient population.
📚 Learn More: National Eye Institute. Nature.
AI x Science 🤖
Credit: Markus Spiske on Unsplash
Google Researchers Share How To Scale AI Context Length
Researchers from Google released a paper explaining a technique to extend the context length of a large language model (LLM) to 500,000 tokens or more.
LLMs are built on transformers, a type of neural network architecture that underpins popular AI models like ChatGPT.
Typical transformers reset their attention memory after each context window to manage the cost and complexity of the new data they receive. For example, if a model is given a book series with 500,000 words, this architecture might split the series into 5 parts, known as context windows, of 100,000 words each. The model has an understanding and memory of the data within each context window, but it has no knowledge, understanding or memory of the data in the 4 other segments. This is like receiving the entire Harry Potter book series to read, except you can only recall and understand one book at a time.
To overcome this limitation, the researchers created a global context mechanism where they effectively compress, summarize and store the key values and information of each context window segment for cheap and easy retrieval. Afterwards, they integrate these context window summaries and merge them with the previous architecture so their AI system has a bridge to understand all 500,000 words together rather than in 100,000-word increments. In our Harry Potter example, this is like writing a concise summary of each novel to capture the essential plot points, themes and key events, then joining those summaries into an organized overview of the entire narrative for future reference.
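The “summarize each window, then merge” pattern can be sketched in a few lines. The toy below stands in for the paper’s mechanism with something far simpler (per-window word counts instead of compressed attention key-value states), so treat it as an illustration of the pattern rather than the actual method:

```python
from collections import Counter

WINDOW = 100_000  # context window size in tokens (toy setting)

def split_into_windows(tokens, size=WINDOW):
    """A plain transformer only attends within one window at a time."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def summarize(window):
    """Stand-in for compressing a window's key-value states:
    here we just keep token frequencies."""
    return Counter(window)

def merge(summaries):
    """Bridge across windows by merging per-window summaries
    into one global memory the model can consult."""
    global_memory = Counter()
    for s in summaries:
        global_memory += s
    return global_memory

tokens = ["spell"] * 150_000 + ["wand"] * 150_000 + ["owl"] * 200_000
windows = split_into_windows(tokens)            # 5 windows of 100k tokens
memory = merge(summarize(w) for w in windows)   # global view of all 500k

print(len(windows))      # 5
print(memory["owl"])     # 200000: visible via the merged memory even though
                         # "owl" never appears in the first three windows
```

The design point is that the merged memory is tiny compared to the raw token stream, which is where the memory-efficiency gains described below come from.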
When the researchers scaled up this system architecture to larger AI models, their model demonstrated state-of-the-art performance on book summarization tasks up to 500,000 tokens (several hundred thousand words). Furthermore, this approach led to significant memory efficiency gains compared to alternative methods, in some cases using up to 114x less memory. Incredible.
While it is unclear if Google used a similar system architecture for its proprietary Gemini 1.5 Pro model, that AI currently has the largest context window available to customers in the world. Furthermore, the Gemini team has already demonstrated that increasing a model’s context also improves its overall performance, so these breakthroughs are meaningful even if models cannot yet recall knowledge perfectly with novel system designs. This field of research has made remarkable progress in the last 2-3 years. For reference, in 2021, a 4,096-token context was considered state-of-the-art. arXiv.
Our Full AI Index
Digital Twins: Researchers from the University of Chicago developed a generative AI tool to create a virtual model of an infant’s microbiome. The system used this data to predict neurodevelopmental deficits that may emerge in the babies over time, and clinicians acted on these predictions to intervene with effective treatments before serious complications emerged. University of Chicago. Science.
Open Source: Cohere recently launched Command R+, its latest large language model. The model has quickly ascended to the 7th spot on the LMSYS leaderboard on Hugging Face, which suggests it is the most advanced open-source model on the market today by far, trailing only the most sophisticated proprietary models like Claude 3 Opus and GPT-4 Turbo. Cohere.
Turing Award: Computer scientist and mathematician Avi Wigderson received the 2023 AM Turing Award for his seminal work on randomness in computation. The Turing Award is one of the most prestigious awards in computer science, and it is named in honor of the British mathematician Alan Turing, who helped develop a theoretical foundation for understanding machine computation. AM Turing.
Semiconductor Infrastructure: The White House announced the U.S. and the Taiwanese chipmaker TSMC have reached a deal for the company to build 3 advanced semiconductor factories in Arizona in exchange for $11B in grants and loans. The White House.
Other Observations 📰
Credit: Markus Spiske on Unsplash
Personalized Cancer Vaccines Are Showing Promise In Trials
The American Association for Cancer Research (AACR) hosted its annual meeting earlier this month, and personalized cancer vaccines were a recurring theme from multiple studies during the event.
For example, biotech companies BioNTech and Genentech presented data on a personalized cancer vaccine they developed called autogene cevumeran. It targets pancreatic cancer, which kills 87% of patients within five years of diagnosis.
In their Phase 1 trial, 16 patients had their pancreatic tumors removed and received the vaccine along with chemotherapy and a monoclonal antibody therapy. While only 50% of the patients showed an immune response to this treatment, 6 of the 8 patients who did respond are still disease-free 3 years after the vaccine was administered. In contrast, 7 of the 8 patients who did not have an immune response to the treatment have experienced cancer recurrence.
Other research also showed promising results, such as data from Moderna and a separate study from Transgene scientists.
While these are all early studies with preliminary data, there appears to be some cautious optimism from the field that personalized cancer vaccines will eventually live up to their promise in larger trials. It will take years for this potential to fully blossom, but the prospects are exciting nonetheless.
Our Full Science Index
Alzheimer’s Disease: Researchers from Columbia University identified a specific gene mutation that may reduce the odds of developing Alzheimer's disease by up to 71%. Columbia University. Acta Neuropathologica.
mRNA Atlas: Researchers from Cornell created an extensive atlas detailing messenger RNA (mRNA) variants in mouse and human brains. This atlas reveals intricate variations in mRNA genes across brain regions, cell types, and developmental stages of a brain, and these results may help other researchers better understand the molecular mechanisms underlying neurological disorders. Cornell. Nature.
3D Printing: Researchers from MIT developed a 3D printer capable of automatically identifying the printing parameters for materials it has not used before. This process is typically done manually, so this technical breakthrough may broaden the range of usable materials and accelerate the pace of additive manufacturing. MIT News. Integrating Materials and Manufacturing Innovation.
Drinking Water: The U.S. Environmental Protection Agency announced its ruling to restrict PFAS exposure in the country’s national drinking water to 4 parts per 1 trillion. While this ruling does not account for all PFAS chemicals, it is the first ever PFAS ruling of its kind. Similarly, the EPA also issued new rules to force hundreds of chemical plants across the U.S. to reduce cancer-linked toxic chemicals they emit into the air. The White House.
Media of the Week 📸
Robots Learn Soccer Via Deep Reinforcement Learning
Google Deepmind shared the results of their research to teach robots how to play soccer via deep reinforcement learning. This video is a demonstration of the robots’ skills after training, and the results are both cute and impressive. Science.
A Nebula Gas Cloud Envelops The Stars
A cloud of gas and dust surrounding a pair of stars. Credit: ESO/VPHAS+ team. Acknowledgement: CASU
New data from the European Southern Observatory suggest that two stars in a system 3,800 light-years away from Earth clashed and merged, producing the nebula gas cloud that now surrounds the star system. ESO. Science.
50+ Ocean Species New To Science Discovered In Chilean Waters
Credit: ROV SuBastian / Schmidt Ocean Institute
Remember the story we shared in February about an international team of scientists who reported discovering 100+ species likely new to science off the coast of Chile? The same team recently embarked on a follow up ocean expedition and reported discovering 50+ more species that are likely new to science. Schmidt Ocean Institute.
This Week In The Cosmos 🪐
April 23: A full moon.
Credit: Melanie Magdalena on Unsplash
That’s all for this week! Thanks for reading.