Many types of spiders produce a strong, sticky, proteinaceous fiber from the spinneret glands on their rear end (their butt, basically). They use it to build webs, sometimes quite pretty and sometimes spooky, to catch their prey in.
Fun fact: even spiders that don’t build webs produce silk, and it plays an important role in, you know, courting potential mates.
Spider webs are useful for many things: reproduction (as mentioned), but also capturing and immobilizing prey, building nests, moving around in the world (some spiders build tiny parachutes), communicating, and leaving pheromonal trails. Spider silk is known to have exceptional mechanical properties, with a tensile strength comparable to that of high-grade steel and a toughness that equals some synthetic polymers.
(Tensile strength relates to the maximum stress a material can withstand while being pulled before breaking, while toughness relates to how much energy a material can absorb by deforming before breaking. In any case, spider silk is a natural material that material scientists would just love to emulate. Biomimetics, you know.)
Spider silk is also very sticky. You know. To catch the foods.
Feelin’ those good vibrations
For those spiders that use their webs to capture prey, vibrations are key to success in their endeavor. When an unsuspecting fly, mosquito, or human wanders into the web, it induces a vibration that the spider can easily distinguish from oscillations created by a breeze, thanks to the tiny hairs that cover its body and legs.
“The virtual reality environment is really intriguing because your ears are going to pick up structural features that you might see but not immediately recognize,” Markus Buehler of MIT explained. “By hearing it and seeing it at the same time, you can really start to understand the environment the spider lives in.”
Here’s an example of one of their spider web sonifications:
Talking to spiders
Each web strand has a different length, which the scientists translated to a sound frequency to create a musical cacophony (if we’re honest) based on the vibrations created by a perturbation. The researchers were even able to develop an algorithm to differentiate between different types of vibrations that might occur, such as “trapped prey,” “web under construction,” or “hot spider just wandered into my web and wants to get busy.”
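Out of curiosity, here’s a rough sketch of how a length-to-pitch mapping could work, treating each strand like an ideal vibrating string. This is my own toy illustration with made-up numbers, not the MIT team’s actual model, which is far more sophisticated:

```python
# Toy illustration only: treat each web strand as an ideal vibrating string,
# so shorter strands ring at higher pitches (f = v / (2 * L)).
# The wave speed and strand lengths are made-up values, not data from the study.

def strand_frequency(length_m, wave_speed=330.0):
    """Fundamental frequency (Hz) of an ideal string of the given length."""
    return wave_speed / (2.0 * length_m)

strand_lengths = [0.01, 0.025, 0.05, 0.1]  # strand lengths in meters
frequencies = [strand_frequency(L) for L in strand_lengths]

for L, f in zip(strand_lengths, frequencies):
    print(f"{L * 100:4.1f} cm strand -> {f:7.1f} Hz")
```

Longer strands give lower notes, so a perturbation rippling across the web plays a different chord depending on which strands it shakes.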
“Now we’re trying to generate synthetic signals to basically speak the language of the spider,” Buehler said. “If we expose them to certain patterns of rhythms or vibrations, can we affect what they do, and can we begin to communicate with them? Those are really exciting ideas.”
But for now, I would just like to imagine spiders as tiny little violinists, creating music with their webs, mocking the sorrowful life of the individual who just blindly wandered into their web.
Bonus number 1: Scientists gave some spiders some drugs. And then those spiders spun some webs. And they looked weird.
I have not made it a big secret that I think penguins are pretty cool. (Does that count as a pun? “Cool,” because they live near the South Pole, get it? Get it?)
So to end this crazy year in style, I want to share some of the news and novel science related to our favorite tuxedo-wearing friends. In style, because tuxedoes are fancy! (Get it?)
Sidenote: While researching “Best Penguin Moments of 2020” I learned that the Pittsburgh Penguins are a hockey team (and not a lovely group of penguins in the Pittsburgh zoo) and that there are many top Penguin book lists circulating on the internet. Not quite what I was looking for!
1. Penguin picture wins Ocean Photography Award 2020
Starting things off with some cuteness: a picture of two penguins that had apparently lost their penguin partners and were seemingly comforting each other won the Community Choice Award at Oceanographic magazine’s Ocean Photography Awards 2020.
That’s all you need to know. Now wallow in cuteness.
2. Penguin Birthday Party
Wellington, a Rockhopper penguin in Chicago’s Shedd Aquarium who gained viral fame earlier this year thanks to a video of him hanging out with a Beluga whale, celebrated his 33rd birthday this year, with a day of fishy deliciousness.
3. TIL, Penguins get vaccinated too!
Birds get the flu too! And more importantly, birds can get vaccinated against the bird flu!
Let this be a reminder that if you are able to, it is worth getting vaccinated against the flu, and when it becomes available to you, against SARS-CoV-2 as well!
4. Penguins make the best of a bad situation
In a tiny bit of silver lining to climate change, recent research showed that Adélie penguins may actually thrive in warmer years. In years when there is less ice, Adélies spend more time swimming, saving energy and covering more foraging ground. The researchers predict the population is likely to grow as the ice caps decline.
Finally, after months of not really writing blog-related content, I leaf through the pile of articles from Science Magazine I had ripped out to find inspiration. On the very top of the pile, I find a short piece on the recently (I’m talking March 2020) discovered fossil of Oculudentavis khaungraae – the tiniest dinosaur. Or is it?
Is it a bird? Is it a lizard?
While doing research, I quickly discover that the original paper was retracted in July – that’s what I get for getting behind on blogging, I guess.
In short, the paper published in March describes the discovery of a tiny skull (7 mm long) embedded in amber, which was categorised as belonging to a bird-like dinosaur, making it the tiniest dinosaur ever found. This creature would probably have been about the size of the smallest living bird, the bee hummingbird. The researchers noted that the creature had large eye sockets (Oculudentavis means “eye-tooth bird,” so they would have been big on eyes, and toothy), like modern lizards.
Finding the tiniest dinosaur would have been pretty cool. But in June the paper was taken down – apparently, new evidence had come to light showing that the fossil might not have been a dinosaur, and therefore not some type of prehistoric bird, but an unusual lizard.
Despite the etymology of the word dinosaur (“terrible lizard”), dinosaurs are actually more closely related to birds than to modern-day lizards. The word “dinosaur” does get used as an umbrella term for prehistoric reptile-like creatures, and gets depicted as such in children’s books and blockbuster movies, but dinosaurs and reptiles are different things – and that includes the feathered dinosaurs that survived the mass extinction some 66 million years ago and eventually evolved into what we now know as birds.
Dinosaurs (including birds) do share a common ancestor with crocodiles: the archosaurs. Crocodiles branched off that part of the evolutionary tree, while lizards, snakes, and other reptiles split off even earlier.
If you find a prehistoric reptile-like fossil, you can tell whether you are looking at a dinosaur or a prehistoric reptile by looking at the hips – for as the ancient saying goes, hips don’t lie. Reptiles have a sprawling stance: their legs connect to the hips at the sides. Dinosaurs, however, have an upright stance: their legs connect to their hips straight under the body, just like birds – which makes sense, because birds are dinosaurs!
I should add that the exact classification of dinosaurs and their subgroups is not entirely agreed on. So if you are a bit confused, you’re not alone. And if you, like the entire Jurassic Park/World franchise, want to call awesome, terrible, sometimes gigantic, extinct, reptile-like creatures by the name Dinosaur, I won’t stop you.
The teeniest dinosaur, but not really
For the fossil found in amber, however, the new fossil data (not yet published) apparently shows that it is not the teeniest dinosaur. Instead, it should be classified as a lizard, albeit an unusual one.
I could end there, but I want to mention one more controversy that I found while looking into this tiny dinosaur debacle, which brings up some of the ethics of fossil mining. These fossils were found in amber mines in Myanmar, mines that are situated in a military conflict zone and riddled with landmines. In addition, these amber mines mostly consist of long tunnels that are dangerous for the miners to work in, and many of the miners work under horrific and exploitative conditions. You can read more about these ethical concerns here: http://markwitton-com.blogspot.com/2020/03/the-ugly-truth-behind-oculudentavis.html.
I’ve been gone, but that does not mean I haven’t been writing! I’ve been testing out some more comedic writing styles, which you can find published (!) on DNAtured (for science-related topics) and The Foreigner Blog (for non-science topics). You can read them here:
Some of the most efficient flying creatures in nature are flying insects. For the limited number of neurons they have, they are incredibly competent in terms of locomotion, navigation, and maneuverability. For a roboticist working on microscale flight, creating an autonomous flying device as small, light, and versatile as an insect is the dream.
Therefore, it is not a surprise that researchers study insects to improve their mini flying robots.
One example is a small quadcopter drone developed by Nakata et al., whose collision avoidance system was inspired by the southern house mosquito. The researchers hypothesized that mosquitos actively sense sound and airflow – specifically, changes in the air patterns created by their own wings as they move close to an obstacle.
Based on this system, the researchers designed a small drone that would sense an obstacle coming close, and automatically course-correct using this low-power sensing method.
Robotics + entomology = robontomology?
Creating flying robot-insects is not the only reason roboticists are interested in insects. The intersection between robotics and entomology can also be useful to better understand insect behavior.
For example, in an effort to answer the more basic question of how flying insects navigate their environment, traditional methods proved quite limiting. Tethering an insect predictably interferes with flight, as does confining the insect to a room where tracking cameras can monitor its flight. In comes robotics: an open cage mount with an autonomous tracking camera (a reactive controller) gives the insects free range to zoom around, while still making it possible to track the complex flight patterns of moths, fruit flies, and mosquitos flying at up to 3 m/s.
In other research, robot-insect hybrids can help us understand insect brain function. By linking an insect brain to a small mechanical robot, the sensing responses of different insects can be closely studied. For example, a Mantis-bot has been used to unravel the mechanism of the mantis’ visual sensing and subsequent motor response.
The educational project BackYard Brains, which uses fun DIY experiments to explore the function of neurons and brains, also uses this robinsect approach to show how electrical impulses can control cockroach movement.
Okay, no, rather than having an insect-sized robot walking around and taking pictures, the researchers made a considerably lighter camera-backpack that beetles could walk around with and take pictures with! A big bottleneck for insect-sized robotics is that these gadgets require power, and batteries are kind of heavy. So by reducing the gadget to a steerable arm with a camera on it on the back of a beetle, rather than making a whole robot that needs to move around and maneuver, the researchers managed to cut down significantly on the weight.
Also, it’s cute as hell!
Thanks for the robot-insect update, Valerie. But what about the Killer Bees?
For the Black Mirror fans, not to worry, no-one is making swarms of bees (yet).
Sources and original research papers linked throughout the text.
The first assumption is that everyone uses the same calendar: the Gregorian calendar.
In October 1582, Pope Gregory XIII introduced the Gregorian calendar as an update to the Julian calendar. Both are solar calendars, i.e. the year starts counting when the sun moves through a fixed point, and a year lasts ~365 days. This is different from a lunar calendar – based on the cycles of the moon, in which a month (or moonth?) lasts roughly 29.5 days – which would not nicely sync up with the seasons.
In the Gregorian calendar, the astronomical cycle of the Earth around the sun, which is about 365.2422 days long, is taken into account by skipping leap years: years divisible by 100 are not leap years, unless they are also divisible by 400*. Even with that correction, the resulting mean year of 365.2425 days has an error of about 26 seconds a year, or one day every 3,030 years.
For most things, most of the world has adopted the Gregorian calendar for their daily life somewhere between 1582 and the early 20th century, even if cultural and religious calendars were kept in parallel.
If you think too much about it, months seem completely arbitrary – except maybe solstices landing on sort of the same date – and other systems, like the Equal Month Calendar (13 months of 28 days, plus an extra day or two depending on leap years), sound more plausible.
But for all intents and purposes, the whole world has agreed that the year starts on January 1st, based on the ideas of a Medieval Pope. And when decades start would depend on Christianity too.
The year 1 A.D.
Our current calendar starts counting from 1 A.D. (or Anno Domini), with year 1 being the year Jesus was allegedly born.
Allegedly, because it wasn’t until 525 A.D. that the year was set by Dionysius Exiguus when he was devising his method to calculate Easter. Historians believe that Jesus was actually born at least a few years earlier, and not necessarily on Christmas day. In any case, 1 A.D. has now generally been adopted as “the start of counting of years” and sometimes referred to as 1 C.E. (common era) to avoid religious connotations.
But the most interesting thing about 1 A.D. – for me at least – is that there is no year 0.
The Roman numeral system had no concept of zero, and it wasn’t until the eighth century that the Arabic numeral for zero was introduced in Europe – and it wasn’t widely used until the seventeenth century.
If that’s the case, did we really just start a new decade?
Counting from zero
A decade is simply “a span of 10 years,” so new decades are constantly starting. We don’t typically celebrate them, except for the decades of our lives and those marked out by the calendar.
In the 20th century, we started referring to decades as groups of years sharing the same digits: the years 1990-1999 are the nineties (dixit this nineties kid), as opposed to counting in groups of ten from year 1 (in which case the nineties would have run from 1991 to 2000). You might think the latter is the more “mathematically correct” way of counting, but whether you start counting from 0 or from 1 is really just a convention – programming languages, for example, commonly index from 0.
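For the programmers in the room, here’s a toy sketch of the two decade-counting conventions (my own illustration, nothing official):

```python
# Toy sketch (mine, not any standard): two ways to say which decade a
# given year belongs to.

def decade_by_digits(year):
    """'The nineties' style: 1990-1999 belong to the same decade."""
    return (year // 10) * 10

def decade_from_year_one(year):
    """Counting style: decade 1 is years 1-10, so 1991-2000 group together."""
    return ((year - 1) // 10) * 10 + 1

print(decade_by_digits(2000))      # → 2000: starts a fresh "digits" decade
print(decade_from_year_one(2000))  # → 1991: still part of the 1991-2000 decade
```

Same year, two different decades, depending entirely on which convention you pick.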
Every 10 years, and definitely at the start of the new millennium, the same discussion occurs: when do we start counting a decade/century/millennium?
A poll from a month ago shows that most Americans (64%) saw the new decade start yesterday, while about a fifth (19%) were not sure. Only 17% answered the new decade will start on January 1, 2021.
In my opinion, it doesn’t really matter. Having a new decade start on a year ending with a 0 looks nicer, and in the 21st century means we can make fun novelty glasses where the lenses fit in the zeros. I’ll continue to be pedantic and say that the decade doesn’t start until next year if only so I can forget about it and not do that “looking back on a decade” thing, ever.
In any case, whether you think the year started yesterday, or the decade, or if it was just a regular old day, have a marvelous 2020!
*The rule is that every year divisible by four is a leap year, except for years divisible by 100, with an exception to that exception for years divisible by 400.
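That rule translates directly into code; here’s a quick sketch (standard Gregorian logic, nothing exotic):

```python
# The footnote's rule, written out directly (standard Gregorian leap-year logic):
def is_leap_year(year):
    """Divisible by 4, except centuries, except centuries divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2020 is divisible by 4; 1900 is a century not divisible by 400;
# 2000 is a century that IS divisible by 400.
print([is_leap_year(y) for y in (2020, 1900, 2000)])  # → [True, False, True]
```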
It’s happened to me. I’m sitting, calmly enjoying a sandwich outside on a bench, and then …
A seagull swoops in and tries to steal my food.
It’s terrifying. Seagulls are scary, especially up close and especially in Dundee, where I used to live and frequently sit outside eating something or the other. They have mean eyes.
I remember the seagulls in Dundee being quite peculiar. An anecdote: I was walking along the sidewalk, edging close to a corner where a seagull was digging through a ripped trash bag. When I was a few meters away, the seagull looked up and did this little walk away from the bag, pretending as if they hadn’t just been digging through trash. After I passed the corner, I glanced back and saw that they’d done a U-turn and gone back to digging.
Okay, maybe I’m giving the bird too much of a personality. But it was weird.
Back to the food-stealing: research conducted at the University of Exeter showed that if you stare at a gull, it is less likely to steal your chips (for US readers: french fries, which are totally not from France but from Belgium, and stop calling things the wrong name and, never mind, I’m okay).
Granted, the study had a limited scope. They tried to test 74 gulls, but more than half of them flew away. And it is likely that a lot of seagull related crime is due to a few bad seeds and most seagulls are perfectly happy leaving you and your food alone and digging through the trash for snacks.
Nevertheless, seagulls that were “looked at” while approaching food were a lot less likely to touch that food. In fact, only a quarter of the seagulls that were watched while approaching food actually touched it.
Maybe they were just scared of getting caught while committing food theft. Maybe they hate the color of our eyes. Maybe our stare is truly terrifying (I certainly know a few people with a scary stare). But next time you see a seagull approaching your food, give them the death stare. Perhaps your meal will be saved.
Quand le doigt montre le ciel, l’imbécile regarde le doigt.
For those who don’t speak French, or have never watched the fantastical modern fairy tale that is Amélie [in that case, stop reading and go watch it], this translates to: “When a finger is pointing up to the sky, only a fool looks at the finger.”
It’s not just fools; most animals would look at your finger and not the object that is being pointed at. Apparently, it is a rare trait to understand what pointing means.
Even though it is often considered rude to point – I surely remember being told that it was – it turns out that pointing is something very human.
What’s the point?
According to Michael Tomasello (Duke University), it all starts at the young age of 9 months.
Some time between being 9 and 12 months old, infants start pointing at things that they want or find interesting. While it is possible for some animals (we’ll get to that later) to look at the pointed-to object, infants understand that there is more to it.
There are different reasons to point. You can point to things that you want, like a cookie or a toy. You can point to things that you find interesting, like a dog or a toy. You can point to things that remind you of a shared experience, like a train or a toy. I guess I really like toys.
At a very young age, infants understand that pointing can be used to draw attention to something. The fact that pointing starts exhibiting itself at such a young age is an indication that it is – at least for some part – an evolved trait rather than learned. By creating a connection, and shared experiences, with another person, you start automatically pointing to things that refer to that shared experience – even before language is developed.
No matter where you travel, what language you speak, how old you are, pointing is universal. We understand that something pointed at is a request to share attention.
Get to the point
So toddlers know that when we point at something, we want them to look at it. While it is possible to teach chimpanzees – our closest cousins in the animal kingdom – to look at the object that is pointed at and to use pointing as a means to communicate, it takes a lot of conditioning. Most chimps fail the “pointing test”.
Dogs, however, pass easily. It seems that living with humans for centuries (millennia even), has led to dogs evolving to understand what pointing means.
Dogs have long been the prime example of understanding what pointing means. Our second-favorite pet, however, was long considered untrainable and aloof – until recently, when new studies showed that cats can pass the pointing test, if they care to participate…
But cats that have a good connection with their owner, and spend a lot of playtime with them, often have the ability to not be the fool, and look at the object rather than the finger. It seems that again, shared experiences are crucial for pointing to work.
In any case, next time someone tells you that it’s rude to point, tell them that it’s human to point.
There are two identical twins. One of them travels through space in a high-speed rocket. When they return home, the Earth-bound twin has aged more. This is a result of special relativity. Very briefly, this is due to time slowing down as higher speeds are reached, and why Matthew McConaughey returned to Earth only to find his 90-something-year-old daughter on her deathbed.
This thought experiment has long been exactly that: a thought experiment. But recently, we were actually able to learn what happens to twins when one is in space (granted, not in a high-speed rocket, but on the ISS) for almost a year, while the other twin stays on Earth.
Real Space Twinsies
On March 27, 2015, astronaut Scott Kelly arrived at the International Space Station (ISS), while his brother, astronaut Mark Kelly, remained on Earth. (One can have a discussion about who was the luckier of the two.) They did the same activities, ate the same things, and followed the same schedule*, the only difference being that Scott was 400 km above the Earth’s surface, travelling at a speed of 7.66 km/s, while Mark was 0 km from the Earth’s surface, travelling at a speed of merely 460 m/s, as we all are.
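For fun, we can plug those two speeds into the special-relativity formula to see how big the twin-paradox effect actually is over the mission. This is my own back-of-the-envelope sketch; it ignores gravitational time dilation, which partly offsets the effect at ISS altitude:

```python
import math

# Back-of-the-envelope sketch (mine, not from the study): special-relativity
# time dilation for the two speeds quoted above.

C = 299_792_458.0            # speed of light, m/s
V_ISS = 7660.0               # Scott's orbital speed, m/s
V_EARTH = 460.0              # Mark's speed due to Earth's rotation, m/s
MISSION_SECONDS = 340 * 86_400

def lorentz_gamma(v):
    """Time-dilation factor for speed v (v << c here, so gamma is barely above 1)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Difference in elapsed proper time between the twins over 340 days:
delta = MISSION_SECONDS * (1 / lorentz_gamma(V_EARTH) - 1 / lorentz_gamma(V_ISS))
print(f"Scott aged roughly {delta * 1000:.0f} milliseconds less than Mark")
```

So by speed alone, a year in orbit buys you only about a hundredth of a second of youth; the paradox is real, but at ISS speeds the effect is tiny.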
340 days later, on March 1, 2016, Scott returned to Earth. For the full duration of his time on the ISS, as well as after his return, numerous samples were collected and tests were conducted to monitor his health and compare the physiological and biological changes that happened as a consequence of spacelife, using his twin brother, a perfect genetic duplicate, as a control.
The effects of space
There are many “unusual” aspects about living in space, compared to living on Earth, including the odd noises of the ISS, the isolation (Scott was in contact with a mere 12 people during those 340 days), the ultra-controlled environment, a disruption of the normal body clock (imagine perpetually being jet-lagged because of constant switching of time zones), living in microgravity and the excess of radiation.
In an ultra-combined effort – i.e. a major collaboration between a lot of different labs that looked at all possible aspects of physiological and biological function – the effects of 340 days in space (on this specific set of twins) were published last month. A lot of changes occur to the human body in space, some more severe than others.
There are some changes that don’t really matter much, like changes in the gastrointestinal microbiome and changes in biomass, which were affected during Scott’s time in space, but rapidly returned to normal after he returned. Not much to worry about.
Mid-level risks included known effects of living in microgravity, such as changes in bone density (you don’t really need to load your bones and muscles while floating around) and changes in how the heart pumps blood (you don’t need to fight gravity to pump blood to the head). NASA already knows this and therefore has a rigorous rehabilitation program for returning astronauts to re-acclimatize to Earth’s gravity.
However, it’s the high-risk findings that we all have to worry about, which are mostly due to prolonged floating and prolonged radiation exposure. Due to changes in pressure, as well as that thing I mentioned about blood pumping, a lot of astronauts experience ocular issues after their return, a risk that only increases with increased dwell time off-Earth. This can severely compromise vision. There is also evidence of some cognitive decline. Both of those aspects are worrying in the light of long-term space travel: we would hope that space explorers can see and think clearly while carrying out dangerous tasks in dangerous conditions. And that’s without considering a final severe risk…
Who’s the oldest twin?
The radiation that Scott experienced on the ISS is pretty much equivalent to 50 years of normal exposure on Earth. This causes significant genomic instability and DNA damage, and consequently an increased risk of developing cancer.
One example of this genomic instability has to do with telomeres**. Telomeres are bits of DNA that cap the end of chromosomes. Every time a cell divides, and in the process duplicates its whole DNA library, the telomeres get shorter. When they get too short, the cell can no longer divide. This is something that happens naturally during aging: shortening of telomeres phases out cells until they can no longer divide. Eventually, this leads to cell death.
A year in space had an odd effect on Scott’s telomeres. Some of them grew longer, while others showed shortening. However, the lengthened telomeres returned to normal after Scott’s landing on Earth, while the shortening persisted. So even though Scott was the space twin in our paradox, he seems to have ended up aging faster than Mark…
A lot happens to a body in space
Overall, the results are pretty surprising: prolonged living in space had more of an effect on the human body than researchers expected. And there is probably a lot more to learn, even just from the data collected from Scott and Mark.
On the one hand, the twin study showed how resilient and robust the human body is. 91.3% of Scott’s gene expression levels returned to baseline within six months of landing, and some of the changes that occurred to his DNA and microbiome were no different from what occurs in high-stress situations on Earth. That’s amazing: the human body has not evolved to survive in space, but it seems to do pretty well considering how outlandish the conditions are!
On the other hand, the prolonged exposure to microgravity and high radiation does have severe effects on human health, leading to increased risk of compromised vision, cardiovascular disease, and cancer development. Even with the rigorous preparation and rehabilitation programs astronauts go through before and after spaceflight, some of these effects will be impossible to avoid.
The massive study, combining the efforts of 84 researchers at 12 different universities, is a feat of collaboration (though nothing compared to the black hole telescope, if I’m honest), and it’s definitely a first that genomes in space vs. on Earth could be compared with a true genetic control. This compiled study, and the many pieces of research expected to be published in the next year with results from the individual studies, provide crucial insight into the long-term effects of space. Considering that a return journey to Mars would take at least a year, this research is valuable for the health of future astronauts and mankind’s ambition to explore further into space.
Want to know more? Watch NASA’s video on the three key findings, or read more in the Science paper or the NASA website (links below).
Markus Löbrich and Penny A. Jeggo. Hazards of human spaceflight. Science 364 (6436) p. 127-128. 2019. DOI: 10.1126/science.aaw7086
Francine E. Garrett-Bakelman, et al. The NASA Twins Study: A multidimensional analysis of a year-long human spaceflight. Science 364 (6436) eaau8650. 2019. DOI: 10.1126/science.aau8650
Cover image: The International Space Station crosses the terminator above the Gulf of Guinea, image credit NASA
*I remember reading this somewhere, but I cannot find the source anymore. It is thus possible that Mark just went about his normal life. Regardless, it is amazing that NASA had the opportunity to do this experiment with a perfect genetic control.
** Fun fact, my spelling check does not know the word “telomeres” and suggests that I mean “omelettes”. Well, I guess they both get super scrambled up in space? (Eeeeh for an inaccurate joke, sorry).